Best Practices for Open Software Ecosystems
Open Systems Architecture (OSA) is delivering on its promise across a wide range of systems. OSA's core principle is broad access to standards, shared information among developers, and, in some cases, downloadable toolkits. This openness fosters a larger community of potential developers and applications, driving increased adoption and usage. It also carries a trade-off: the same openness creates opportunities for cyber intrusion and attack, and raises the possibility of suboptimal code entering the system through OSA's own mechanisms.
Open Systems vs. Open Source
Open systems, characterized by a well-defined set of accessible architectures and standards, welcome a broad spectrum of developers and designs. The term 'open' pertains to the standards embedded in the architecture, with the development community varying from a known, restricted, and qualified group to a very large community. Examples of open systems include the Portable Document Format (together with the international standards committee responsible for the PDF specification) and the Android environment.
On the other hand, Open Source denotes software cultivated by a diverse community, employing a universally exposed design and governed by an open license. It’s noteworthy that open system architectures can exist independently of open-source code. In open systems, software development may be exclusive to a specific set of developers but adheres to an open set of standards.
Discover Yourself, Understand Your Ecosystem
Open systems commonly involve an integrator who combines developer products into a usable unit for customers, ensuring standards compliance. The resulting ecosystem is a key outcome of the Open Systems Architecture (OSA) effort. Users, particularly smartphone users, rely on the integrator's environment to choose developer products for their devices. The security of such systems depends on the robustness of the architectural and developmental processes.
Exploits often arise from unforeseen errors or suboptimal implementation, causing security concerns and potential performance issues in open systems. While a diverse developer community brings expertise, it also raises the risk of unintentional inclusion of suboptimal code.
Understanding open systems and their ecosystems is crucial for integrators. Knowledge of standards and their security implications must be incorporated into processes. Although there are concerns that excessive process could chill the ecosystem, those concerns are outweighed by the potential impact of a security breach.
Each interface in an open system offers an opportunity for exchange, but the integrator must consider the security implications. An open system should ideally define not only the shell and format of an interface but also its intended behavior, akin to defining both the shell and the yolk of an egg. Integrators must systematically assess the poor coding practices and security implications associated with each standard interface.
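To make the egg analogy concrete, the sketch below (Python, with hypothetical names such as `MessageCodec`) shows a standard that defines both the "shell" of an interface, its method signatures, and part of its "yolk", the behavior any conforming implementation must exhibit:

```python
from abc import ABC, abstractmethod


class MessageCodec(ABC):
    """Hypothetical standard interface: the 'shell' (names and signatures)."""

    @abstractmethod
    def encode(self, payload: dict) -> bytes: ...

    @abstractmethod
    def decode(self, data: bytes) -> dict: ...


def check_behavior(codec: MessageCodec) -> None:
    """The 'yolk': behavioral properties the standard should also pin down."""
    sample = {"id": 1, "body": "hello"}
    data = codec.encode(sample)
    # Round-trip property: decoding an encoded payload recovers it exactly.
    assert codec.decode(data) == sample
    # Determinism: the same payload always yields the same bytes.
    assert codec.encode(sample) == data
```

An integrator can run a behavioral check like this against every submitted implementation, catching modules that satisfy the signatures but not the intended semantics.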
To gain knowledge of standards, integrators can rely on widely adopted standards from reputable standards bodies. Testing processes, including a developer sandbox, interface stubs, and testing tools, help identify issues early. Additional testing at scale during system integration testing helps ensure the robustness of developer products before inclusion in system releases.

Pooling solutions into libraries promotes code reuse and implementation improvements, enhancing knowledge of components. Libraries should include documentation, undergo compliance processes, and track every inclusion or change to identify vulnerabilities and limitations.
Supporting Developers in Refining Their Software Engineering Processes
Similar to integrators, developers’ processes can prevent the exploitation of vulnerabilities. Therefore, it is in the integrator’s best interest to assist the developer community in creating, testing, and understanding its implementations. Integrators can offer support in two ways:
- Providing developers with examples and guidance on best practices and processes, including workflow guidance.
- Granting known developers access to knowledge bases and manuals.
This assistance enables integrators to offload testing to the broader community early in the development cycle, minimizing the cost of fixes. One approach is to furnish developers with a consistent set of tools for testing software versions for standards and interface compliance, common exploits, unintended inclusions, coding missteps, and other potential attack vectors. Interface stubs help ensure developers adhere to standards, reducing the openings available for cyber attack. Sharing tips and coding examples enhances the developer community's knowledge.
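As a sketch of the interface-stub idea (the service, channel range, and canned values here are illustrative, not from any real system), a stub can stand in for the integrator's service, return canned data, and reject calls that stray outside the standard, so developers catch compliance errors long before integration:

```python
class SensorServiceStub:
    """Stand-in for a sensor service defined by an (assumed) open standard.

    Returns canned values and rejects out-of-spec calls so developers can
    test compliance locally, before system integration testing.
    """

    STANDARD_CHANNELS = range(0, 8)  # the assumed standard defines channels 0-7

    def __init__(self) -> None:
        self.calls: list[tuple[str, int]] = []  # record calls for inspection

    def read(self, channel: int) -> float:
        if channel not in self.STANDARD_CHANNELS:
            raise ValueError(f"channel {channel} is outside the standard range 0-7")
        self.calls.append(("read", channel))
        return 0.0  # canned value; a richer stub might replay recorded data
```

Because the stub enforces the standard's limits and records every call, a developer's test suite exercises the same boundaries the real system will impose.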
For developers using commercial-off-the-shelf (COTS) products, integrators should be acquainted with that community and recommend suitable packages while being aware of the benefits and flaws in off-the-shelf components, which are readily available for attackers seeking exploits.
After bringing in applications or components from the developer community, a standard battery of tests should confirm their compliance and safety. During development, these tests should run automatically and include developer identification, a standards-compliance check, a security scan (viruses, known exploits), and a best-practices check. This first layer of testing should quickly tell a developer whether they meet the minimums for inclusion, with additional checks performed on a test bed. Modularity facilitates these checks.
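The first-layer gate described above might look like the following sketch, where each check, the `PKG1` header convention, and the toy security scan are invented for illustration:

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Submission:
    developer_id: str
    artifact: bytes


CheckResult = tuple[bool, str]  # (passed, failure message)


def check_identity(s: Submission) -> CheckResult:
    return (bool(s.developer_id), "developer must be identified")


def check_standards(s: Submission) -> CheckResult:
    # Assumed convention: compliant artifacts start with a 'PKG1' header.
    return (s.artifact.startswith(b"PKG1"), "missing standards header")


def check_security(s: Submission) -> CheckResult:
    # Toy scan; a real gate would run virus and exploit scanners here.
    return (b"eval(" not in s.artifact, "suspicious eval() found")


CHECKS: list[Callable[[Submission], CheckResult]] = [
    check_identity,
    check_standards,
    check_security,
]


def intake(s: Submission) -> tuple[bool, list[str]]:
    """Run every check and report all failures at once, so the developer
    learns quickly whether the submission meets the minimums."""
    failures = [msg for check in CHECKS for ok, msg in [check(s)] if not ok]
    return (not failures, failures)
```

Reporting all failures in one pass, rather than stopping at the first, keeps the feedback loop short for the developer community.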
In the long term, a plan must be in place for sustaining open system standards and processes, which may require a dedicated fund or long-term budgeting. Open system standards often become obsolete and need upgrading or replacement. As standards age and are replaced, their communities shrink, leaving a static document that can result in vendor lock, defeating a key benefit of open systems! Sustainment plans should consider cyber issues and other standards changes. Procurers should define funding and responsibilities for changes in contracts, heading off complaints like "we aren't paid to fix this." To stay open and abreast of recent developments, acquirers and integrators of OSA must account for standards updates and implementation changes.
Zero Trust
Trust is a critical factor when utilizing modules or applications developed by various parties within an open system. Nothing running on an open system, whether in a virtual machine or connected to it, can be inherently trusted. Therefore, all access and usage must be meticulously controlled. The adoption of elements within an open community should follow a defined process and take place within a specified container. The modularity of a system correlates with its openness, with increased openness posing greater trust challenges for the community. Clearly defining parameters for usage and access ensures that virtual entities and human entities understand the permissions for behavior and access concerning each service, application, module, user, and element of the system.
Developer modules should be isolated within a virtual box, permitted only specific interfaces and access to processing and network resources as outlined by the integrator. In many open systems, this isolation of modules is achieved using a virtual machine structure, ensuring that each module operates within a well-defined space with limits on processing access, memory location and access, and controlled communication access via defined and regulated protocols and ports. Access to resources should adhere to the interfaces defined within the space, and instances are confined solely to the designated box. Only the integrator-defined space should be authorized to instantiate, create, or destroy boxes.
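One way to express the integrator-defined box limits is as an explicit, deny-by-default policy object. The fields and names below are illustrative, not drawn from any particular virtualization product:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class BoxPolicy:
    """Integrator-defined limits for one module's box (illustrative fields)."""
    cpu_shares: int                      # relative share of processing access
    memory_mb: int                       # memory ceiling for the box
    allowed_ports: frozenset[int]        # the only ports the module may open
    allowed_interfaces: frozenset[str]   # named interfaces defined in the space


def authorize(policy: BoxPolicy, kind: str, value) -> bool:
    """Deny by default: grant only what the integrator explicitly listed."""
    if kind == "port":
        return value in policy.allowed_ports
    if kind == "interface":
        return value in policy.allowed_interfaces
    return False  # anything unlisted, including box creation, is refused
```

Keeping the policy immutable (`frozen=True`) and refusing unknown request kinds mirrors the rule that only the integrator-defined space may grant new capabilities.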
Another critical capability is the implementation of a monitoring package that observes the box in real time, issuing alerts and logging any unauthorized activities. Ideally, misbehaving modules should face additional restrictions and alerts, or all processes in the module should be terminated. Instances of virtual machines and users must authenticate before utilizing system resources. Rogue virtual machines must be controlled or terminated, as even unintentional processes can impact performance. Logging, an essential practice, should be routinely mined to enhance systems, improve performance, and identify weak or suspicious elements within the ecosystem.

Encrypting storage within boxes and on common resources is an additional measure to bolster security. Similarly, encrypting communications between boxes may enhance security in some instances, although this comes with the trade-off of increased overhead. Striking a balance between performance and overall system security is imperative.
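A minimal sketch of such a monitor, under the assumption that box activity arrives as simple event records, could escalate from logging to termination as strikes accumulate; the event shape and strike threshold are invented for illustration:

```python
import logging


class BoxMonitor:
    """Watches one box's events; logs, alerts, then terminates (sketch)."""

    def __init__(self, allowed_ports: set[int], strikes_allowed: int = 3) -> None:
        self.allowed_ports = allowed_ports
        self.strikes_allowed = strikes_allowed
        self.strikes = 0
        self.terminated = False

    def observe(self, event: dict) -> None:
        if self.terminated:
            return
        if event.get("type") == "connect" and event.get("port") not in self.allowed_ports:
            self.strikes += 1
            # Every violation is logged so the record can be mined later.
            logging.warning("unauthorized port %s (strike %d/%d)",
                            event.get("port"), self.strikes, self.strikes_allowed)
            if self.strikes >= self.strikes_allowed:
                # A real system would also kill the box's processes here.
                self.terminated = True
```

The retained log of strikes is exactly the kind of record the routine mining described above would draw on to spot weak or suspicious elements.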
Facilitated by the integrator of an open system, moderated boards and blogs, linked to discussions on protocols and standards, offer developers insights into potential pitfalls during development. Concurrently, integrators gain knowledge about possible issues and exploits. While there is understandable reluctance to reveal vulnerabilities, trusted developers should be informed about issues affecting the security of their builds when integrated into the open system. A resilient community may also involve in-person forums and discussions on security and cyber topics, providing opportunities to identify potential adversaries and counteract social exploits.
Conclusion
Open systems inherently present tradeoffs in the cyber realm. While their expansive ecosystem introduces the potential for misbehavior, it also offers opportunities for diverse inclusion and additional concepts. Mitigating measures can be implemented to address issues while leveraging the benefits. Understanding your community, acknowledging its limitations, and capitalizing on the strengths of the ecosystem are key considerations.