COTS
What is COTS
Commercial Off-The-Shelf (COTS) software is functionality acquired from an independent third party and used on an “as is” basis. COTS software products are packaged and include a “description or definition of the functions the software performs, documented to good commercial standards, and a definition of the resources needed to run the software”.
As noted above, COTS usage has become widespread and COTS components now form part of almost all complex software products. Some examples of systems that use COTS as a basis for their development:
• Database management systems
• Geographic Information Systems (GIS)
• Operating systems, including low-level software such as device drivers, windowing systems, etc.
• Middleware (or services for communication and distribution)
Why COTS
COTS software is widely used because it saves time, effort, and money. Organizations that adopt a COTS-based systems approach generally expect one or more of the following benefits:
• Lower development cost
• Use of proven, reliable software to reduce risk (although some challenges have been observed)
• Increased functionality
• Reduced production time
• Reduced software development effort
• Adherence to standards
• Availability of expertise on COTS products in the marketplace
COTS Challenges
Using COTS-based software can lead to faster or less costly system construction when the construction is grounded in a good understanding of the system's needs. However, using COTS raises several challenges that have become significant issues for software developers:
• It ties the system's capability and evolution to vendors, and the COTS product may still have to be customized to fit the product being developed.
• Source code and expertise are unavailable within the organization.
• Knowledge of quality and reliability is difficult to acquire.
• The constant churn of COTS software leads to high maintenance costs and can make the system difficult to build, field, and support.
• Training and support from the COTS vendor are needed.
• Acceptance testing and configuration management are critical.
• It is difficult to isolate defects between the COTS products and the system software developed.
COTS-based software systems
The rationale for building COTS-based systems is that they involve less development time by taking advantage of existing, market-proven, vendor-supported products, thereby reducing overall system development costs. However, because of the two defining characteristics of COTS (lack of access to product source code and lack of control over product evolution), there is a trade-off: new software development time can indeed be reduced, but generally at the cost of increased software component integration work. Moreover, using COTS software also brings a host of unique risks quite different from those associated with software developed in-house. Among the risks that should be examined when determining the true cost of integrating a COTS software component into a system is security.
Security in COTS-based software systems
Definitions of Security Terms
The term "security" can have many different meanings and be used in many different contexts. The overall notion of security is separated into attributes, pressures, or mechanisms. A security attribute is a characteristic that is desirable in a software component or system such as confidentiality, integrity, and non-repudiation. A security pressure is a force that can negatively impact the security attributes of a system, for example, threats, vulnerabilities, and risks. A security mechanism is an approach or method used to protect the security attributes of a system from security pressures, for example, identification and authentication, authorization, and cryptography.
Attributes
• Confidentiality: Sensitive data is held in confidence, limited to an authorized set of individuals or organizations.
• Integrity: Processes do only and exactly what they are stated to do, and data are not modified during storage or transmission.
• Non-repudiation: The sender of data is provided with proof of delivery and the recipient is assured of the sender's identity, so that neither can later deny having processed (i.e., having sent or received) that data.
Pressures
• Threat: Any circumstance or event with the potential to cause harm to a system in the form of destruction, disclosure, modification of data, and/or denial of service.
• Vulnerability: A weakness in system security procedures, system design, implementation, internal controls, or other areas that could be exploited to violate system security policy. This may involve unauthorized disclosure, unauthorized modification, and/or loss of information resources, as well as the authorized but incorrect use of a computer or its software.
• Risk: The probability that a particular threat will exploit a particular vulnerability of the system, usually involving an assessed balance between threat and vulnerability.
Mechanisms
• Identification and Authentication: Verification of the originator of a transaction, similar to the signature on a check or a Personal Identification Number (PIN) on a bankcard.
• Authorization: The granting of access rights to a user, program, or process.
• Cryptography: The protection of data to make it unintelligible to anyone other than authorized recipients. Many techniques are known for the conversion of data, called plain text, into its encrypted form, called cipher or cipher-text.
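The three mechanisms above can be made concrete with a small sketch. The following is a minimal, illustrative Python example, not a production security implementation: the identifiers (ACL, make_credential, etc.) are invented for this illustration, and the encryption part assumes the widely used cryptography package. It shows identification and authentication via salted password hashing, authorization via a simple access-control list, and cryptography via symmetric encryption of plain text into cipher-text.

```python
# Minimal, illustrative sketch of the three security mechanisms.
# All identifiers (ACL, make_credential, ...) are hypothetical examples,
# not part of any specific COTS product or standard API.
import hashlib
import hmac
import os

from cryptography.fernet import Fernet  # assumes: pip install cryptography

# --- Identification and Authentication: salted password hashing ---
def make_credential(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def authenticate(password: str, salt: bytes, expected: bytes) -> bool:
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(digest, expected)  # constant-time comparison

# --- Authorization: granting access rights to a user ---
ACL = {"alice": {"read", "write"}, "bob": {"read"}}  # hypothetical access list

def authorized(user: str, right: str) -> bool:
    return right in ACL.get(user, set())

# --- Cryptography: plain text to cipher-text and back ---
key = Fernet.generate_key()
cipher = Fernet(key)
token = cipher.encrypt(b"sensitive payroll record")  # cipher-text
plain = cipher.decrypt(token)                        # back to plain text

if __name__ == "__main__":
    salt, digest = make_credential("s3cret")
    print("authenticated:", authenticate("s3cret", salt, digest))
    print("alice may write:", authorized("alice", "write"))
    print("bob may write:", authorized("bob", "write"))
    print("round-trip ok:", plain == b"sensitive payroll record")
```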
Need for Security in COTS-based Systems (CBS)
An essential characteristic of the commercial world is the need for openness, which makes COTS components especially susceptible to malicious attack (and hence more difficult to trust). For instance, a product is available to anyone: both the integrators who develop a system using off-the-shelf software components, and the threat agents (i.e., "hackers") who try to obtain access in a malicious or unauthorized manner. Since hackers have access to the same products (and second-hand information about them) that the integrators do, they can purchase these components, install them in their environment, and pick and probe at the components in the privacy of their own testbed until a vulnerability is revealed. Once a vulnerability is revealed, hackers can use it to compromise any system that employs that component. Hackers can also read bug lists and use them to probe for other weaknesses.
As a result, attackers need less specialized knowledge to break through security. This further strengthens the need for security measures, especially in CBS, given their open nature.
Security issues in COTS-based systems
Maintaining the security of the software and the data can raise a number of issues in systems that contain COTS products, e.g.:
• COTS products may introduce security vulnerabilities into the system, since neither the designers of the products nor the design and implementation are visible to the organization. This may demand special procedures for testing or certifying that the COTS products do not present security risks, and each time the COTS products change, the system may have to be recertified.
The questions that arise concerning these security factors are:
• Have the long-term security requirements and policies of the organization been identified, and have the security capabilities of the COTS products been evaluated relative to these requirements?
• Can new security mechanisms be integrated into the system if required?
Security Risk Assessment
Assume that each COTS component is a potential source of vulnerabilities, and perform security risk assessments based on that assumption. When using COTS components to achieve system security, we cannot claim that all commercially available components are trustworthy, which has led some to argue that COTS should never be considered when designing and developing critical software systems. Moreover, the COTS component marketplace does not provide fully secure components that can meet all of the requirements specified by the system. Before a decision is made to avoid COTS or to embrace COTS as a security solution, the actual security risk to the system should be identified carefully.
Typically this is done through a security risk assessment, which consists of identifying the following:
- Security policy: a formal statement of the rules by which people who are given access to an organization's technology and information assets must abide.
- Scope of the assessment: a defined boundary of what the assessment will and will not address (for example, electronic security (computers and networks) but not physical security (buildings and employees)).
- Usage scenarios: the system execution threads and functions under which the system is intended to operate.
- Assets: the data, information, and property that are under the protection of the system under assessment.
- Threat agents: persons with an interest in obtaining access or information, modifying information, or interrupting services in a malicious or unauthorized manner.
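As an illustration only, these elements could be captured in a simple record structure so that assessments are written down consistently. The sketch below is a hypothetical Python data model; the class and field names are assumptions made for this example, not part of any standard assessment schema.

```python
# Hypothetical sketch of a security risk assessment record.
# Class and field names are illustrative assumptions, not a standard schema.
from dataclasses import dataclass, field

@dataclass
class SecurityRiskAssessment:
    security_policy: str                                      # formal statement of the access rules
    scope: str                                                # what the assessment does and does not cover
    usage_scenarios: list[str] = field(default_factory=list)  # intended execution threads/functions
    assets: list[str] = field(default_factory=list)           # data, information, property protected
    threat_agents: list[str] = field(default_factory=list)    # parties with malicious interest

# Example usage with made-up content:
assessment = SecurityRiskAssessment(
    security_policy="Only authorized staff may access customer records.",
    scope="Electronic security (computers and networks); physical security excluded.",
    usage_scenarios=["online order entry", "nightly batch reporting"],
    assets=["customer database", "payment records"],
    threat_agents=["external attacker", "disgruntled insider"],
)
print(assessment.scope)
```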
In performing a security risk assessment, be suspicious: immediately assume that the likelihood of compromising any particular COTS component is high. In other words, assume it is probably easy to compromise the component and difficult to detect the compromise. Taking this worst-case perspective simplifies the task of assessing the actual impact on the system. Consider the following:
• If the impact on the system is low, then regardless of how easy it is to compromise the COTS component, the end result (of the compromise) is of little consequence to the system as a whole.
• If the impact is high, but the countermeasure(s) are effective and affordable, then the potential for compromise is, again, of little consequence to the system as a whole.
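A minimal Python sketch of this worst-case triage follows. The component names, impact ratings, and countermeasure flags are all invented for illustration; the point is simply that likelihood of compromise is fixed at "high" for every COTS component, so only impact and the effectiveness and affordability of countermeasures drive the conclusion.

```python
# Worst-case triage sketch: every COTS component is assumed easy to
# compromise; only impact and countermeasures drive the conclusion.
# All component names and ratings below are hypothetical.
from dataclasses import dataclass

@dataclass
class CotsComponent:
    name: str
    impact: str                      # "low" or "high" impact of a compromise on the system
    countermeasure_effective: bool   # is an effective countermeasure available?
    countermeasure_affordable: bool  # is that countermeasure affordable?

def triage(component: CotsComponent) -> str:
    # Worst-case assumption: likelihood of compromise is always high,
    # so it never appears in the decision below.
    if component.impact == "low":
        return "little consequence: compromise has low impact on the system"
    if component.countermeasure_effective and component.countermeasure_affordable:
        return "little consequence: high impact, but countermeasures are effective and affordable"
    return "significant risk: high impact and no effective/affordable countermeasure"

components = [
    CotsComponent("report viewer", "low", False, False),
    CotsComponent("database engine", "high", True, True),
    CotsComponent("network middleware", "high", False, True),
]
for c in components:
    print(f"{c.name}: {triage(c)}")
```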
If the use of COTS components in the system is necessary, there is a need to increase trust in the COTS components and thus avoid their side effects. One way is to learn about the inner workings of the components, just as hackers do. There are techniques for gaining insight into components, such as using operating system services and shared library protocols to introspect and snoop on a component's call stack and its interfaces with other components; one such tool is the truss utility under UNIX. For example, if a component were to perform an unexpected operation against an unknown network address, it would raise suspicion about the integrity of the component. Another way is to isolate vulnerable components from the rest of the system, for instance by applying a component firewall that only permits access, control, or information flow between a known and tested component service and the rest of the system. TCP Wrapper (a public domain component) is an example of a component firewall that isolates UNIX-native network services (telnet, ftp).
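To illustrate the component-firewall idea in miniature, the following is a hedged Python sketch. It is not TCP Wrapper itself, and the UntrustedComponent class and its allow-list are invented for this example: a proxy object stands between the system and a COTS-style component, forwards only allow-listed operations, and logs everything else as suspicious.

```python
# Illustrative component-firewall sketch: a proxy forwards only
# allow-listed operations to an untrusted (COTS-style) component.
# UntrustedComponent and the allowed-call list are hypothetical.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("component-firewall")

class UntrustedComponent:
    """Stand-in for a black-box COTS component."""
    def read_record(self, key: str) -> str:
        return f"record for {key}"
    def open_connection(self, address: str) -> str:
        return f"connected to {address}"  # potentially dangerous operation

class ComponentFirewall:
    """Only allow-listed methods reach the wrapped component."""
    def __init__(self, component, allowed_calls):
        self._component = component
        self._allowed = set(allowed_calls)

    def call(self, method_name: str, *args, **kwargs):
        if method_name not in self._allowed:
            log.warning("blocked suspicious call: %s(%r, %r)", method_name, args, kwargs)
            raise PermissionError(f"{method_name} is not permitted through the firewall")
        return getattr(self._component, method_name)(*args, **kwargs)

guarded = ComponentFirewall(UntrustedComponent(), allowed_calls=["read_record"])
print(guarded.call("read_record", "customer-42"))        # permitted
try:
    guarded.call("open_connection", "198.51.100.7:23")   # blocked and logged
except PermissionError as exc:
    print("denied:", exc)
```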
Towards managing COTS risks
To address COTS risks effectively, the acquiring activity needs to implement a set of risk mitigation steps when the organization decides to go with COTS:
-- Avoid COTS altogether and write all the software in house. Avoiding COTS entirely and writing all the needed software from scratch gives the organization complete control over the software, but it forgoes all the benefits associated with COTS and so is not generally desirable.
-- Depend on the law for protection. The legal system might provide some respite for the organization that uses COTS software, but laws covering software are few and far between, and their enforceability is in question due to unsatisfactory legal precedents.
-- Demand warranties from COTS vendors. Warranties may be enforceable, but they tend to be difficult to obtain, since adequate testing is such a difficult proposition even with an extremely large user base.
-- Perform system-level testing with the COTS software embedded. Most software security vulnerabilities result from program bugs and malicious misuse, so methodologies for analyzing software to discover these vulnerabilities and potential avenues for exploitation must be taken seriously.
-- Ask for independent product certification, as opposed to personnel or process certification. There are many certification standards that can be applied to the software process, such as the ISO 9000 model, and others that certify the people who build software. Such certification models cannot offer high levels of assurance, since they do not test the actual software for robustness or security.
-- Determine the robustness of the system if the COTS software were to fail. One way for a developer to gain reasonable assurance of the robustness of a COTS component is to test the component in house. Currently the best verification and validation technique for doing so is software fault injection. Fault injection techniques can be tailored to COTS systems by systematically injecting faults into the inputs of the COTS application and then observing the outputs of the COTS software for unacceptable results (a small sketch of this approach follows this list).
-- Use software wrappers. Fault injection techniques identify unacceptable behavior by a COTS component, but they cannot automatically correct those problems. Since the system developer normally does not have access to the source of COTS components, it is also not feasible to repair detected faults. One way to help alleviate the problem is for the developer to wrap the COTS software, disallowing it from exhibiting undesired functionality by placing a software barrier around the component that limits what it can do. Generally, such wrapping is done by testing inputs before a call and only allowing the call if the input is likely to produce acceptable behavior.
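A minimal sketch of input fault injection against a black-box component, in Python. Everything here is hypothetical: cots_parse_amount stands in for a COTS routine under test, and the fault cases and acceptability check are invented for illustration. The point is only the loop of perturbing inputs and observing the outputs for unacceptable results.

```python
# Hypothetical fault-injection harness: perturb the inputs of a black-box
# (COTS-style) function and observe its outputs for unacceptable results.
# cots_parse_amount and the fault cases are invented for illustration.

def cots_parse_amount(text: str) -> float:
    """Stand-in for a COTS routine whose source we cannot inspect or fix."""
    return float(text)  # note: no input validation, like many real components

# Faulty or anomalous inputs injected in place of normal ones.
FAULT_CASES = ["", "   ", "12,5", "1e400", "-1", "NaN", "9999999999999999999", "abc"]

def acceptable(result: float) -> bool:
    # Invented acceptability criterion: amounts must be finite and non-negative.
    return result == result and result >= 0 and result != float("inf")

def run_fault_injection() -> None:
    for case in FAULT_CASES:
        try:
            result = cots_parse_amount(case)
            verdict = "ok" if acceptable(result) else "UNACCEPTABLE output"
            print(f"input {case!r:>25} -> {result!r:<12} {verdict}")
        except Exception as exc:  # crashes count as unacceptable behavior too
            print(f"input {case!r:>25} -> raised {type(exc).__name__}: {exc}")

if __name__ == "__main__":
    run_fault_injection()
```

In a real assessment the harness would drive the actual COTS interface (a library call, a network service, a command-line tool) rather than a stand-in function, and the acceptability check would come from the system's own security requirements.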