DO-178C


Software Considerations in Airborne Systems and Equipment Certification
Abbreviation: DO-178C / ED-12C
Latest version: 5 January 2012
Organization: RTCA (jointly with EUROCAE)
Predecessor: DO-178B
Domain: Aviation

DO-178C, Software Considerations in Airborne Systems and Equipment Certification, is the primary document by which certification authorities such as the FAA, EASA, and Transport Canada approve all commercial software-based aerospace systems. The document is published by RTCA, Incorporated, in a joint effort with EUROCAE, and replaces DO-178B. The new document, designated DO-178C/ED-12C, was completed in November 2011, approved by the RTCA in December 2011, and became available for sale and use in January 2012.[1][2][3]

Except for FAR 33/JAR E, the Federal Aviation Regulations do not directly reference software airworthiness.[4] On 19 July 2013, the FAA approved AC 20-115C, designating DO-178C a recognized "acceptable means, but not the only means, for showing compliance with the applicable FAR airworthiness regulations for the software aspects of airborne systems and equipment certification."[5]

Background


Since the release of DO-178B, FAA Designated Engineering Representatives (DERs) had strongly called for clarification and refinement of the definitions of, and boundaries between, the key DO-178B concepts of high-level requirements, low-level requirements, and derived requirements, as well as a better definition of the entry/exit criteria between system requirements and system design (see ARP4754) and between software requirements and software design (the domain of DO-178B). Other concerns included the meaning of verification in a model-based development paradigm and considerations for replacing some or all software testing activities with model simulation or formal methods. DO-178C and the companion documents DO-278A (Ground Systems), DO-248C (additional information with rationale for each DO-178C objective), DO-330 (Tool Qualification), DO-331 (Modeling), DO-332 (Object Oriented), and DO-333 (Formal Methods) were created to address these issues. The SC-205 members worked with the SAE S-18 committee to ensure that ARP4754A and the DO-xxx documents noted above provide a unified and linked process with complementary criteria.

Overall, DO-178C keeps most of the DO-178B text, which has raised concerns that issues with DO-178B, such as the ambiguity about the concept of low-level requirements, may not be fully resolved.[6]

Committee organization


The RTCA/EUROCAE joint committee work was divided into seven Subgroups:

  • SG1: SCWG Document Integration
  • SG2: Issues and Rationale
  • SG3: Tool Qualification
  • SG4: Model Based Development and Verification
  • SG5: Object-Oriented Technology
  • SG6: Formal Methods
  • SG7: Safety Related Considerations

The Model Based Development and Verification subgroup (SG4) was the largest of the working groups. All work was collected and coordinated via a collaborative work-management website.[7] Working artifacts and draft documents were held in a restricted area available only to group members.

The work was focused on bringing DO-178B/ED-12B up to date with respect to current software development practices, tools, and technologies.[8][9]

Software level


The Software Level, also known as the Development Assurance Level (DAL) or Item Development Assurance Level (IDAL) as defined in ARP4754 (DO-178C only mentions IDAL as synonymous with Software Level[10]), is determined from the safety assessment process and hazard analysis by examining the effects of a failure condition in the system. The failure conditions are categorized by their effects on the aircraft, crew, and passengers.

  • Catastrophic - Failure may cause deaths, usually with loss of the aircraft.
  • Hazardous - Failure has a large negative impact on safety or performance, or reduces the ability of the crew to operate the aircraft due to physical distress or a higher workload, or causes serious or fatal injuries among the passengers.
  • Major - Failure significantly reduces the safety margin or significantly increases crew workload. May result in passenger discomfort (or even minor injuries).
  • Minor - Failure slightly reduces the safety margin or slightly increases crew workload. Examples might include causing passenger inconvenience or a routine flight plan change.
  • No Effect - Failure has no impact on safety, aircraft operation, or crew workload.

DO-178C alone is not intended to guarantee software safety. Safety attributes in the design, and as implemented in functionality, must be addressed by additional mandatory system safety tasks that drive and show objective evidence of meeting explicit safety requirements. The certification authorities require, and DO-178C specifies, that the correct DAL be established using these comprehensive analysis methods to establish the software level, A through E. "The software level establishes the rigor necessary to demonstrate compliance" with DO-178C.[10] Any software that commands, controls, or monitors safety-critical functions should receive the highest DAL, Level A.

The number of objectives to be satisfied (some with independence) is determined by the software level A-E. The phrase "with independence" refers to a separation of responsibilities where the objectivity of the verification and validation processes is ensured by virtue of their "independence" from the software development team. For objectives that must be satisfied with independence, the person verifying the item (such as a requirement or source code) may not be the person who authored the item and this separation must be clearly documented.[11]

Level   Failure condition    Objectives[12]   With independence
A       Catastrophic         71               30
B       Hazardous            69               18
C       Major                62               5
D       Minor                26               2
E       No Safety Effect     0                0
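As a rough illustration of the mapping above, the following minimal Python sketch encodes the table as a lookup from failure-condition category to software level and objective counts. The data comes from the table; the type and function names are illustrative only and are not part of the standard or any certification tooling.

```python
# Minimal sketch: failure-condition category -> DO-178C software level and
# objective counts, taken from the table above. Names are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class LevelProfile:
    level: str
    failure_condition: str
    objectives: int
    with_independence: int

PROFILES = {
    "Catastrophic":     LevelProfile("A", "Catastrophic", 71, 30),
    "Hazardous":        LevelProfile("B", "Hazardous", 69, 18),
    "Major":            LevelProfile("C", "Major", 62, 5),
    "Minor":            LevelProfile("D", "Minor", 26, 2),
    "No Safety Effect": LevelProfile("E", "No Safety Effect", 0, 0),
}

def software_level_for(failure_condition: str) -> LevelProfile:
    """Return the software level profile for a failure-condition category."""
    return PROFILES[failure_condition]

if __name__ == "__main__":
    p = software_level_for("Hazardous")
    print(f"Level {p.level}: {p.objectives} objectives, "
          f"{p.with_independence} with independence")
```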

Processes and documents


Processes are intended to support the objectives, according to the software level (A through D; Level E is outside the purview of DO-178C). Processes are described as abstract areas of work in DO-178C, and it is up to the planners of a real project to define and document the specifics of how a process will be carried out. On a real project, the actual activities carried out in the context of a process must be shown to support the objectives. These activities are defined by the project planners as part of the Planning process.

This objective-based nature of DO-178C allows a great deal of flexibility in regard to following different styles of software life cycle. Once an activity within a process has been defined, it is generally expected that the project respect that documented activity within its process. Furthermore, processes (and their concrete activities) must have well defined entry and exit criteria, according to DO-178C, and a project must show that it is respecting those criteria as it performs the activities in the process.

The flexible nature of DO-178C's processes and entry/exit criteria makes it difficult to implement the first time, because these aspects are abstract and there is no "base set" of activities from which to work. The intention of DO-178C was not to be prescriptive, so there are many possible and acceptable ways for a real project to define these aspects. This can be difficult the first time a company attempts to develop a civil avionics system under this standard, and it has created a niche market for DO-178C training and consulting.

For a generic DO-178C-based process, Stages of Involvement (SOI) are the minimum gates at which a certification authority becomes involved in reviewing a system or sub-system, as defined by EASA in Certification Memorandum SWCEH-002 and by the FAA in Order 8110.49 (Software Approval Guidelines).

Traceability

Diagram illustrating the required bidirectional tracing between certification artifacts, as required by the RTCA DO-178C standard. Thin blue-colored traces and blue-filled boxes are required only for Level A. Purple-colored traces and purple-filled boxes are required for Levels A, B, and C. Thick green-colored traces and green-filled boxes are for Levels A, B, C, and D. Level E does not require any tracing. The references on each trace arrow represent references to the standard for the objective, the activity, and the review/verification, respectively.

DO-178C requires documented bidirectional connections (called traces) between the certification artifacts. For example, a Low Level Requirement (LLR) is traced up to the High Level Requirement (HLR) it is meant to satisfy, and is also traced down to the lines of source code meant to implement it, the test cases meant to verify the correctness of the source code with respect to the requirement, the results of those tests, and so on. A traceability analysis is then used to ensure that each requirement is fulfilled by the source code, that each functional requirement is verified by test, that each line of source code has a purpose (is connected to a requirement), and so forth. Traceability analysis assesses the system's completeness. The rigor and detail of the certification artifacts are related to the software level.
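A minimal Python sketch of this idea follows, assuming a simplified artifact model: each low-level requirement records its traces up to high-level requirements, down to source code, and across to test cases, and a completeness check reports missing traces and untraced code. The class, field, and identifier names are hypothetical and are not drawn from DO-178C.

```python
# Minimal sketch of a bidirectional trace model and a completeness check,
# in the spirit of the traceability analysis described above. Artifact and
# field names are illustrative, not taken from DO-178C.
from dataclasses import dataclass, field

@dataclass
class LowLevelRequirement:
    ident: str
    parent_hlrs: list = field(default_factory=list)   # trace up to HLRs
    source_refs: list = field(default_factory=list)   # trace down to code
    test_cases: list = field(default_factory=list)    # trace to verification

def traceability_gaps(llrs, all_source_refs):
    """Report LLRs with missing traces and source code traced to no requirement."""
    gaps = []
    covered_code = set()
    for llr in llrs:
        if not llr.parent_hlrs:
            gaps.append(f"{llr.ident}: no trace to a high-level requirement")
        if not llr.source_refs:
            gaps.append(f"{llr.ident}: not implemented by any source code")
        if not llr.test_cases:
            gaps.append(f"{llr.ident}: not verified by any test case")
        covered_code.update(llr.source_refs)
    for ref in sorted(all_source_refs - covered_code):
        gaps.append(f"{ref}: source code with no requirement (possible dead code)")
    return gaps

llrs = [LowLevelRequirement("LLR-12", ["HLR-3"], ["fms.c:42"], ["TC-7"]),
        LowLevelRequirement("LLR-13", ["HLR-3"], [], [])]
for gap in traceability_gaps(llrs, {"fms.c:42", "fms.c:88"}):
    print(gap)
```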

Differences with DO-178B


SC-205/WG-12 was responsible for revising DO-178B/ED-12B to bring it up to date with respect to current software development and verification technologies. The structure of the document remains largely the same from B to C. Example changes include:[13]

  • Clearer and more consistent language and terminology
  • More objectives (for Levels A, B, and C)
  • Clarified the "hidden objective", applicable to Level A, which was implied by DO-178B in section 6.4.4.2b but not listed in the Annex A tables. This objective is now explicitly listed in DO-178C, Annex A, Table A-7, Objective 9: "Verification of additional code, that cannot be traced to Source Code, is achieved."[14]
  • Parameter Data Item Files - Provide separate information that influences the behavior of the executable object code without changing it. An example would be a configuration file that sets up the schedule and major time frames of a partitioned operating system (a minimal configuration-check sketch follows this list). The parameter data item file must be verified together with the executable object code, or else it must be tested for all possible ranges of the parameter data items.
  • DO-330 "Software Tool Qualification Considerations", a new "domain independent, external document", was developed to provide guidance for an acceptable tool qualification process. While DO-178B was used as the basis of the development of this new document, the text was adapted to be directly and separately applicable to tool development and expanded to address all tool aspects. As a domain-independent, stand-alone document, DO-330 is intended for use not only in support of DO-178C/ED-12C, but DO-278/ED-109, DO-254/ED-80, and DO-200 as well, even for non-aviation applications, e.g., ISO 26262 or ECSS.[15] Consequently, tool qualification guidance was removed in DO-178C, replaced therein with guidance for deciding when to apply DO-330 tool qualification guidance to tools used in a DO-178C context.[16]
  • Technology supplements were added to extend the guidance of the DO-178C document to specific techniques. Rather than expanding the prior text to account for all current and future software development techniques, supplements are made available to explicitly add, delete, or otherwise modify the guidance of the core standard for application to specific techniques or technologies. All guidance in these supplements is written in the context of the affected guidance elements in DO-178C and so should be considered to be at the same level of authority as the core document.[17]
    • DO-331 "Model-Based Development and Verification Supplement to DO-178C and DO-278A" - addressing Model-Based Development (MBD) and verification and the ability to use modeling techniques to improve development and verification while avoiding pitfalls inherent in some modeling methods
    • DO-332 "Object-Oriented Technology and Related Techniques Supplement to DO-178C and DO-278A" - addressing object-oriented software and the conditions under which it may be used
    • DO-333 "Formal Methods Supplement to DO-178C and DO-278A" - addressing formal methods to complement (but not replace) testing
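As a hedged illustration of the parameter data item concept noted in the list above, the following Python sketch checks a hypothetical partition-schedule configuration against the allowed ranges of its data items. The file contents, field names, and limits are invented for illustration and do not come from DO-178C or any real operating system.

```python
# Minimal sketch: checking a hypothetical parameter data item set (here a
# partition schedule) against the allowed ranges of its data items. The
# field names and limits are invented for illustration only.
ALLOWED_RANGES = {
    "major_frame_ms":    (10, 1000),   # overall schedule period
    "partition_slot_ms": (1, 200),     # per-partition time slice
    "partition_count":   (1, 16),
}

def verify_parameter_data(items: dict) -> list:
    """Return a list of findings for out-of-range or unknown data items."""
    findings = []
    for name, value in items.items():
        if name not in ALLOWED_RANGES:
            findings.append(f"{name}: not a defined parameter data item")
            continue
        lo, hi = ALLOWED_RANGES[name]
        if not (lo <= value <= hi):
            findings.append(f"{name}={value}: outside allowed range [{lo}, {hi}]")
    return findings

schedule = {"major_frame_ms": 50, "partition_slot_ms": 250, "partition_count": 4}
for finding in verify_parameter_data(schedule):
    print(finding)
```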

Guidelines vs. guidance


DO-178B was not completely consistent in its use of the terms guidelines and guidance. "Guidance" conveys a slightly stronger sense of obligation than "guidelines". With DO-178C, the SCWG therefore settled on using "guidance" for all statements considered "recommendations", and replaced the remaining instances of "guidelines" with "supporting information", using that phrase wherever the text is more "information" oriented than "recommendation" oriented.

The entire DO-248C/ED-94C document, Supporting Information for DO-178C and DO-278A, falls into the "supporting information" category, not guidance.[18]

Sample text difference between DO-178B and DO-178C


Section 6.1 defines the purpose of the software verification process. DO-178C adds the following statements about the Executable Object Code:

  • "The Executable Object Code satisfies the software requirements (that is, intended function), and provides confidence in the absence of unintended functionality."
  • "The Executable Object Code is robust with respect to the software requirements that it can respond correctly to abnormal inputs and conditions."

As a comparison, DO-178B states the following with regard to the Executable Object Code:

  • "The Executable Object Code satisfies the software requirements."

The additional Revision C clarification filled a gap that a software developer could have encountered when interpreting the Revision B document.[19]
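To make the distinction concrete, the following Python sketch contrasts a normal-range, requirements-based test case with a robustness test case that exercises abnormal input. The function under test, its input range, and its limits are hypothetical; this is only an illustration of the intent behind the Revision C wording, not certification test code.

```python
# Minimal sketch contrasting a normal-range test with a robustness test for
# abnormal input, in the spirit of the DO-178C wording above. The function
# under test and its limits are hypothetical.
def commanded_flap_angle(lever_position: float) -> float:
    """Hypothetical unit under test: map a lever position (0.0-1.0) to degrees."""
    if not (0.0 <= lever_position <= 1.0):
        raise ValueError("lever position out of range")
    return lever_position * 40.0

def test_normal_case():
    # Requirements-based test: the code satisfies the intended function.
    assert commanded_flap_angle(0.5) == 20.0

def test_robustness_abnormal_input():
    # Robustness test: abnormal input is rejected rather than producing
    # unintended behavior.
    try:
        commanded_flap_angle(1.5)
    except ValueError:
        return
    raise AssertionError("abnormal input was not rejected")

test_normal_case()
test_robustness_abnormal_input()
print("both test cases passed")
```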


References

  1. ^ "RTCA, Inc". Rtca.org. Retrieved 7 August 2016.
  2. ^ Charlotte Adams (1 September 2010). "DO-178C nears finish line, with credit for modern tools and technologies". Avionics Intelligence. Retrieved 23 October 2010. The industry expects the final package —DO-178C— to be released in the first quarter of 2011 and be mandated six to nine months after ratification.
  3. ^ "Summary of Difference Between DO-178B and DO-178C". FAA Consultants.com. Qualtech Consulting, Inc. Archived from the original on 27 August 2010. Retrieved 23 October 2010. The release of these long anticipated standards will occur in mid 2011 and be recognized by the Certification Authorities in 2012.
  4. ^ Leslie A. (Schad) Johnson. DO-178B, Software Considerations in Airborne Systems and Equipment Certification (in the context of software development for military aircraft, a practitioner's discussion of the evolution of the current practice and application of RTCA/DO-178B). Boeing Commercial Airplane Group. p. 11. Retrieved 3 March 2022.
  5. ^ "Archived copy" (PDF). Archived from the original (PDF) on 3 September 2014. Retrieved 2013-08-08.{{cite web}}: CS1 maint: archived copy as title (link)
  6. ^ Dale, Chris; Anderson, Tom, eds. (2010). Advances in systems safety : proceedings of the Nineteenth Safety-Critical Systems Symposium, Southampton, UK, 8-10th February 2011. London: Springer. pp. 298–299. ISBN 9780857291325.
  7. ^ "SC-205/WG-71 Plenary". Archived from the original on 19 July 2011. Retrieved 2010-09-18.
  8. ^ Bill StClair & Tim King (7 March 2012). "DO-178C brings modern technology to safety-critical software development". Military Embedded Systems. Retrieved 17 April 2012.
  9. ^ "DO-178C Enhances Safety-Critical Avionics Software Development". Electronic Design. Retrieved 17 April 2012.
  10. ^ a b RTCA/DO-178C "Software Considerations in Airborne Systems and Equipment Certification", p. 116. "One example is the term 'item development assurance level' (IDAL), which for software is synonymous with the term 'software level'."
  11. ^ RTCA/DO-178C "Software Considerations in Airborne Systems and Equipment Certification", p. 41
  12. ^ RTCA/DO-178C "Software Considerations in Airborne Systems and Equipment Certification", Annex A
  13. ^ "HighRely Synopsis of National FAA Software and Hardware Meeting Includes DO-178C Status". 2006. Retrieved 30 September 2009. DO-178C will contain more details on software modeling and the potential ability to use modeling to supplant some of the verification techniques normally required in DO-178B. DO-178C will also more fully address OO (Object Oriented) software and the conditions under which it can be used and the certification ramifications of OO in DO-178C.
  14. ^ RTCA/DO-178C Software Considerations in Airborne Systems and Equipment Certification. RTCA, Inc. 2011.
  15. ^ Pothon, Frédéric. "Principles and benefits of using DO-330/ED-215" (PDF). validas. Retrieved 3 October 2019.
  16. ^ Pothon, Frédéric; et al. (2012). DO-178C/ED-12C versus DO-178B/ED-12B Changes and Improvements (PDF). p. 49. Retrieved 5 January 2015.
  17. ^ Pothon, pp. 43-46
  18. ^ Pothon, p. 14
  19. ^ "Achieving DO-178C compliance with Parasoft Development Testing". Archived from the original on 11 September 2014. Retrieved 2013-03-07.