Securing Platforms and Networks

This research theme investigated selected technologies and engineering processes that have the most potential for mitigating the inevitable vulnerabilities of software platforms. The security and reliability of modern distributed information systems depend critically both on mobile and cloud computing platforms and on the communication networks that connect them. Both the computing platforms and the networks are highly complex and based on open architectures. In such systems, the presence of malicious users, code and network traffic is almost guaranteed. It is therefore necessary to harden the platforms against attacks and to create a small trusted computing base on which secure applications can be built. Similarly, the communication networks need to be designed to be resilient under continuous attacks and to provide safe, isolated environments for critical functions and applications.

During the project, trusted and data-secure service frameworks, technologies, processes and implementations were created to support public-sector organizations and private companies with business-critical applications. The goal was to build a generic model and set of practices for secure application management that can be utilized in public- and private-sector organizations in Finland.

The focus areas were applications of secure hardware, hardware-supported software isolation and virtualization, and the standards and processes, such as rigorous security testing, which help ensure the resilience of critical system components.

Since connectivity in modern, highly distributed information systems is typically provided by wired and wireless network operators, the research focused on new IP networking technologies, especially software-defined networking (SDN), and on the resilience of the operator and datacenter networks built on these technologies. The end-to-end security and resilience of communication is typically part of the service design and is implemented on the platform or application level. Business-critical applications must take responsibility for end-to-end cryptographic protection and security policy enforcement, and they must be built to withstand disruption of the underlying communication channels.

Situation awareness is important for a large number of organizations but is not among their core business activities. Cyber security visualization, situation awareness and the related training therefore require business models that enable service providers to deliver sensitive real-time situational information (possibly across organizational borders) to client organizations. Some of the situational information could also be directed to public bodies.

Computer network architectures are currently undergoing their biggest change in decades: new networks are being built using software-defined networking (SDN) technologies and patterns, such as OpenFlow and network virtualization.

Security Technologies and Secure Management for Future Networks

The technical architectures and topologies of the Internet are changing fast: in the near future, network architectures may be based on Software Defined Networking (SDN) and Network Functions Virtualization (NFV). These concepts introduce new virtual network elements and interfaces, which affect the logic of network operation and traffic management as well as the entire system architecture.

In data centers, wide-area networks and cellular mobile networks, these new network technologies enable the flexible and elastic on-demand provisioning of secure network services. This evolution will lead to a significant change in network architectures, protocols and management. It will also create new business possibilities, such as multi-tenant core-network models, as well as other unforeseen opportunities, especially in the security area.

The Cyber Trust program developed a technical SDN environment that acts as a testbed for integrating the NFV and SDN architectures, for investigating business opportunities, and for analyzing the related cyber threats, using the existing Realistic Global Cyber Environment (RGCE) as one source. New technical security solutions were also developed, improved and built on top of existing network technology. The goal was both to solve identified security issues in SDN and to exploit SDN technology to enable new security services and business.

As the sharing of mobile network resources becomes more common – a development also encouraged by the EU – it will open up new revenue opportunities for classical mobile operators to leverage their network infrastructure and services, and they will exploit the new SDN and NFV technologies to do so. The new network topologies and different kinds of NFV architectures will bring not only new flexibility but also new threats, such as attacks spreading through the network, which must be prevented and detected. This highlights the need for isolation mechanisms and for the effective deployment and chaining of security-related services.

Future SDN

New networks will be built using SDN technologies in which routing and network topology are defined at a software-based controller rather than at the routers, switches and physical links. The controller will be one of the key components of an SDN.
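
To make the controller's role concrete, the following is a minimal sketch of an SDN controller application written with the open-source Ryu framework (one of several controllers that could be used in such networks; the framework choice and rule details are illustrative, not a description of any specific project setup). It installs the standard table-miss rule that sends unmatched packets to the controller, the basic pattern on top of which filtering and monitoring services are built.

```python
# Minimal OpenFlow 1.3 controller app (Ryu framework) -- an
# illustrative sketch, not the project's actual controller code.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3

class MinimalController(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def switch_features_handler(self, ev):
        dp = ev.msg.datapath
        ofp, parser = dp.ofproto, dp.ofproto_parser
        # Table-miss rule: forward unmatched packets to the controller,
        # which can then decide to install more specific flow rules.
        match = parser.OFPMatch()
        actions = [parser.OFPActionOutput(ofp.OFPP_CONTROLLER,
                                          ofp.OFPCML_NO_BUFFER)]
        inst = [parser.OFPInstructionActions(ofp.OFPIT_APPLY_ACTIONS,
                                             actions)]
        dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=0,
                                      match=match, instructions=inst))
```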

The Puikkari SDN platform, provided by JAMK, is a proof-of-concept implementation of a customer portal for a modern next-generation computer network using SDN and Network Functions Virtualization (NFV) technologies. The platform enables the network to provide different kinds of services for network users and administrators. Such a service could be, for example, a firewall placed in front of a network customer, or a network traffic monitoring tool that an administrator uses to debug network problems.

The platform was developed to handle the network of a small imaginary ISP in a virtual test internet setup. The project focused on researching what kinds of business opportunities these new network technologies could create and how they affect network security. Although the main target during development was a small ISP, the platform could be used in a network of any size, for example in campus networks or even small home networks.

JAMK created a virtualized Cyber Trust SDN testbed in the Realistic Global Cyber Environment (RGCE). This environment has been available to the research consortium. The testbed was developed to serve as a platform for testing multiple different scenarios in an SDN network. The environment includes a controller network in which any SDN controller can be deployed with ease. In the data plane, the SDN switches are virtualized Open vSwitches and the routers are VyOS virtual machines. The network peers over BGP (Border Gateway Protocol) with RGCE to imitate a real Internet connection as closely as possible.
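
A comparable miniature data plane can be emulated on a single machine for experimentation. The sketch below uses the Mininet emulator with Open vSwitch and an external OpenFlow controller; the topology, names and addresses are illustrative assumptions, not the RGCE configuration.

```python
#!/usr/bin/env python
# Emulate a tiny SDN data plane: two hosts, one Open vSwitch,
# and a remote OpenFlow controller. Illustrative sketch only.
from mininet.net import Mininet
from mininet.node import RemoteController, OVSSwitch
from mininet.cli import CLI

def build():
    net = Mininet(switch=OVSSwitch)
    # Assumes a controller (e.g. the Ryu app above) listens on
    # localhost:6653.
    net.addController('c0', controller=RemoteController,
                      ip='127.0.0.1', port=6653)
    s1 = net.addSwitch('s1')
    h1 = net.addHost('h1', ip='10.0.0.1/24')
    h2 = net.addHost('h2', ip='10.0.0.2/24')
    net.addLink(h1, s1)
    net.addLink(h2, s1)
    net.start()
    CLI(net)        # interactive prompt: try 'pingall'
    net.stop()

if __name__ == '__main__':
    build()
```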

The Puikkari and RGCE testbeds have been used by JAMK students to learn SDN. Both test environments have been used by thesis workers to learn SDN, create proofs of concept and test the solutions for their theses. Both test environments can be seen as valuable assets for further research.

For Elisa, the background for this work was the upcoming change in the technical architectures and topologies of the Internet. A great part of the work concentrated on business cases and on developing the related technical architectures. Together with JAMK and the University of Oulu, Elisa set up test systems and validated the use cases, which concentrated on Software-Defined WAN (SD-WAN) technologies and corporate network CPE systems. Furthermore, Elisa took part in business model development with Ericsson and the University of Oulu.

Joint testing with JAMK provided good technical and business insights into corporate customer CPE and firewall (FW) management. New know-how was gained on SD-WAN security and on how to manage large global WAN networks in the 2020s.

Elisa also tested with JAMK a number of SDN-based CPE cyber security scenarios, which will guide further product development and productization. The main focus of the collaboration was to investigate how the HPE SDN VAN Controller, along with SDN applications such as HPE Network Protector, can be used to filter malicious DNS traffic in different parts of the network. The aim was to find out how the system needs to be configured in three different network topology cases, with the switch located either inside the ISP's core network, at the edge of the network, or inside the customer premises.
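
The same filtering idea can be expressed as a generic OpenFlow application. The sketch below, again written against the Ryu framework, inspects DNS queries punted to the controller and silently drops those for blocklisted domains; the blocklist and decision logic are invented for illustration, and this is a simplified stand-in for a commercial application such as HPE Network Protector, not its implementation.

```python
# Sketch of DNS filtering at an SDN controller (Ryu, OpenFlow 1.3).
# The blocklist and decision logic are illustrative assumptions.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import MAIN_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3
from ryu.lib.packet import packet, udp

BLOCKLIST = {'malware.example.com'}    # invented entry, for illustration

def qname(dns_payload):
    """Extract the first query name from a raw DNS message."""
    labels, i = [], 12                 # skip the 12-byte DNS header
    while i < len(dns_payload) and dns_payload[i] != 0:
        n = dns_payload[i]
        labels.append(dns_payload[i + 1:i + 1 + n]
                      .decode('ascii', 'replace'))
        i += 1 + n
    return '.'.join(labels)

class DnsFilter(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPPacketIn, MAIN_DISPATCHER)
    def packet_in(self, ev):
        pkt = packet.Packet(ev.msg.data)
        u = pkt.get_protocol(udp.udp)
        if u and u.dst_port == 53:
            payload = pkt.protocols[-1]   # raw DNS bytes after UDP
            if isinstance(payload, (bytes, bytearray)) \
                    and qname(payload) in BLOCKLIST:
                self.logger.info('dropping DNS query for blocked domain')
                return                    # no PacketOut => query dropped
        # otherwise: normal forwarding logic would go here
```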

In terms of SD-WAN products and services, Cyber Trust offered Elisa a context in which to study next-generation WAN services and how cyber security technologies will be, and should be, implemented in them.

Part of VTT's research involved advancing SDN security. The tasks included setting up and experimenting with the latest SDN platform technologies, including their management (technology based on OpenStack). The technology development was highly experimental. The goal of this work was to gain more know-how for securing Network Functions Virtualization (NFV) elements in an OpenStack environment.
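
One concrete building block in this direction is restricting network access to virtualized functions with OpenStack security groups. A minimal sketch using the openstacksdk library is shown below; the group name, port and address range are illustrative assumptions.

```python
# Restrict management access to an NFV element with an OpenStack
# security group -- an illustrative sketch using openstacksdk.
import openstack

# Reads credentials from clouds.yaml / environment variables.
conn = openstack.connect()

# Allow SSH management traffic only from an (assumed) admin subnet.
sg = conn.network.create_security_group(
    name='nfv-mgmt', description='management access for NFV elements')
conn.network.create_security_group_rule(
    security_group_id=sg.id, direction='ingress', ethertype='IPv4',
    protocol='tcp', port_range_min=22, port_range_max=22,
    remote_ip_prefix='192.0.2.0/24')   # example admin network (RFC 5737)
```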

The SDN security work with a major company produced a new environment for experimentation, together with insights into its capabilities, weaknesses and future improvements. The technology is still nascent, and VTT was able to provide some stepping stones that may eventually lead to its future adoption.

The research group of the University of Jyväskylä analyzed the security vulnerabilities of SDN networks. The main outcome was a method the group developed for analyzing SDN network traffic in order to find anomalies (attacks) in it. The researchers concentrated on the timely detection of intentional co-residence attempts in cloud environments that utilize software-defined networking. SDN enables global visibility of the network state, which allows the cloud provider to monitor and extract the necessary information from each flow in every virtual network in online mode. The researchers analyzed the extracted statistics on different levels in order to find anomalous patterns. The detection results showed that the co-residence verification attack can be detected with methods usually employed for botnet analysis.
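
As a simplified illustration of anomaly screening over flow statistics, the sketch below flags flows whose packet rate deviates strongly from the rest of the population using a median-based outlier test; the feature choice and threshold are illustrative assumptions, not the method developed in the project.

```python
# Screen controller-collected flow statistics for outliers.
# A simple median/MAD test -- illustrative, not the project's method.
import statistics

def anomalous_flows(flow_stats, threshold=5.0):
    """flow_stats: iterable of (flow_id, packets_per_second)."""
    pairs = list(flow_stats)
    rates = [r for _, r in pairs]
    med = statistics.median(rates)
    mad = statistics.median(abs(r - med) for r in rates) or 1.0
    return [fid for fid, r in pairs if abs(r - med) / mad > threshold]

# Example: one flow probing far above the typical rate gets flagged.
stats = [('f1', 12.0), ('f2', 9.5), ('f3', 11.2), ('f4', 480.0)]
print(anomalous_flows(stats))   # -> ['f4']
```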

SDN security research was also done by other research organisations.

5G Security

It was important to gain more understanding of how novel technologies will mold new networks, especially 5G. 5G networks are not only faster, but they also provide a backbone for many new services, such as the IoT and the Industrial Internet. These services will provide connectivity for everything from autonomous cars and remote health monitoring through body-attached sensors to smart logistics through item tracking and remote diagnostics. Most services will be integrated with cloud computing and novel concepts, which will require smooth and transparent communication between user devices, data centers and operator networks.

Together with Ericsson, the University of Oulu and Elisa carried out research whose results were disseminated, for example, at the NSS 2017 conference and in the Wiley book A Comprehensive Guide to 5G Security.

The book provides reference material for a comprehensive study of 5G security. It is the first comprehensive guide to the design and implementation of security in 5G wireless networks and devices, offering insight into current and future threats to mobile networks and the mechanisms to protect them. It covers the critical lifecycle functions and stages of 5G security, shows how to build an effective security architecture for 5G-based mobile networks, and offers security considerations for all the relevant stakeholders of mobile networks.

The Cyber Trust program and its results will also guide Finnish cyber security research. One important aspect here is the wide scope and the holistic view taken of the topic. Another is the relevance of cyber security in 5G and how the SDN/NFV technologies will affect this development.

Secure Wireless Technology Platform

There are approximately 180 million enterprise BYOD (Bring Your Own Device) devices globally, and the number is expected to increase to 390 million by 2015. The total BYOD and enterprise mobility market is expected to reach $181.39 billion by 2017, with a CAGR of 15.7%.

Commercial technology is being adopted in systems where closed solutions were earlier used. In the US, FirstNet has been allocated a nationwide spectrum license and will provide a network based on 3GPP Long-Term Evolution (LTE) technology, the first broadband network dedicated to public safety. Similar development is also taking place in many countries in Europe, the Middle East and Asia, the UK Home Office being a prime example. Security plays an increasingly important role in all these markets, on top of their other specific requirements.

Device manufacturers have received requests from corporate customers to add further privacy and security features to their offerings. These requests stem from the US and Asian dominance in smartphone operating systems and the related cloud-based service businesses, which has raised vocal concerns about the security, privacy and trustworthiness of the current (mobile) digital environment.

Certification requirements

How can a company convince the customer that the security features offered are really there and working flawlessly? Product certification is one way to show that a product does what it promises. Bittium studied the national security certification requirements for handling restricted (ST-IV) and confidential (ST-III) information. Advanced authentication means were defined and demonstrated to meet the requirements.

The research done in the various DIMECC projects has also been taken into account in standardization work in different standards organisations: 3GPP (the 3rd Generation Partnership Project), the IETF (Internet Engineering Task Force) and ETSI (the European Telecommunications Standards Institute). 3GPP unites seven telecommunications standards development organizations. The mission of the IETF is to make the Internet work better by producing high-quality, relevant technical documents that influence the way people design, use and manage the Internet. ETSI produces globally applicable standards for information and communications technologies, including fixed, mobile, radio, converged, broadcast and Internet technologies.

Products today are more and more complex, typically including tens or even hundreds of open source components. No software can be tested completely, and security vulnerabilities can be found in a software component even years after its release. It is therefore essential that possible software vulnerabilities and exploits are checked regularly. Process improvement and tool development were done in this area during the Cyber Trust program. The developed process was demonstrated successfully at Bittium and has been taken into daily use.

The public safety and security areas have until today been very different from commercial solutions and devices. The situation is changing, and commercial consumer-based technology will in the future be adopted in public safety as well. This changes the business model for the ecosystem. The University of Oulu Business School has analyzed the current and new business environments.

Improving Security in Wireless Product

During the program, the Cyber Trust Handbook for Improving Security in Wireless Product was created by Bittium, the University of Turku, the University of Oulu, VTT, and Capricode. The focus of the handbook is improving the security of a wireless product during its life cycle, from development to the product's end of life.

  • Integrity Protection

Integrity protection means maintaining the software's integrity so that possible changes by malware can be detected without delay. It requires a hardware trust anchor and mechanisms to protect both the boot chain and user-space components. Integrity protection ensures that the platform software's integrity is kept. However, in many products software applications can be downloaded, and at the same time the devices are mobile. The Cyber Trust program presents improved methods to ensure that applications cannot use the device's resources in a malicious way.

Integrity protection mechanisms are needed to prevent unauthorized modifications to system software and applications. Sometimes this can even be a safety issue, if an unauthorized modification causes danger, such as a radio interface exceeding the permitted Specific Absorption Rate (SAR) values. Integrity protection is also a cornerstone of security: it is needed to guarantee that all protection mechanisms operate as designed. Whatever complex access control or Digital Rights Management (DRM) mechanisms are in place to protect content, they will all fail if attackers are able to manipulate components that are supposed to be trusted. Trust therefore requires that the integrity of components handling confidential information is verified before those components are used.

Integrity protection mechanisms are developed to support chained verification, where each component in the boot chain verifies the next component before passing control to it. For example, the first-stage bootloader should verify the second-stage bootloader, and the second-stage bootloader should verify the kernel image. Verification is typically done by calculating a cryptographic hash of the component and then verifying the result against the signature of the verified image. This verification chain requires a trusted starting point: the system must have security hardware, such as a Trusted Platform Module (TPM) or ARM TrustZone, which allows binding of the device identity and the use of device-specific signing keys.
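
In software terms, each link of the chain reduces to hashing the next image and checking a signature over it with a key anchored in hardware. The sketch below shows one such step using the Python cryptography library; the file names and key handling are illustrative, and in a real device this logic runs in firmware with the verification key (or its hash) fused into the hardware.

```python
# One link of a verified boot chain: check the next stage's signature
# before handing over control. Illustrative sketch only.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.exceptions import InvalidSignature

def verify_next_stage(image_path, sig_path, pubkey_path):
    """Return True only if the image's signature checks out."""
    with open(image_path, 'rb') as f:
        image = f.read()
    with open(sig_path, 'rb') as f:
        signature = f.read()
    with open(pubkey_path, 'rb') as f:
        # In a real device the key (or its hash) is anchored in
        # hardware, e.g. fused into the SoC -- not read from disk.
        pubkey = serialization.load_pem_public_key(f.read())
    try:
        pubkey.verify(signature, image,
                      padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

if verify_next_stage('kernel.img', 'kernel.sig', 'vendor_pub.pem'):
    print('kernel verified, continuing boot')
else:
    raise SystemExit('verification failed, halting boot')
```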

Integrity protection requires mechanisms to protect both the components of the boot chain and user-space components. A hardware-based trust anchor is needed to establish the root of trust for measurement. The Linux kernel provides several alternatives for integrity protection; either a file-based or a block-based approach can be used. The software update mechanism should also be protected, as it can be misused to install malware on the system. Remote attestation can be used to provide proof of the system's integrity to a remote verifier.
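
For user-space components, a file-based approach boils down to comparing each file's hash against a trusted reference list. The sketch below shows the idea in plain Python with an assumed "path hash" manifest format; production systems would instead rely on kernel mechanisms such as IMA or dm-verity and on a signed manifest.

```python
# File-based integrity check against a trusted manifest
# ("path sha256hex" per line). Illustrative sketch only.
import hashlib

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(65536), b''):
            h.update(chunk)
    return h.hexdigest()

def check_manifest(manifest_path):
    """Yield paths whose current hash differs from the manifest."""
    with open(manifest_path) as f:
        for line in f:
            path, expected = line.split()
            if sha256_of(path) != expected:
                yield path

for modified in check_manifest('manifest.txt'):
    print('integrity violation:', modified)
```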

  • Fuzz Testing in Agile Test Automation

No design is perfect, and the implementation even less so. Testing is therefore needed, and fuzz testing has turned out to be a very effective testing method. The Cyber Trust program introduced methods to make fuzz testing more effective and easier to use.

Fuzz testing (fuzzing) is a practical way of testing software for security vulnerabilities arising from processing external input and is usually included in secure development lifecycle models. In 2015, nearly all high-impact vulnerabilities with a Common Vulnerabilities and Exposures (CVE) entry were found with fuzzing. Fuzzing is clearly something that should be done when developing secure software, but applying it efficiently as a part of an agile process can be challenging.

Fuzzing can be integrated into agile software development and test automation, but there are difficulties. The main challenge is that fuzzing needs to continue to show value while requiring minimal effort.

The effectiveness of fuzzing depends on how well it is executed. This begins from data flow based threat modelling, which should be continuously updated. Threat modelling should also take results from previous fuzzing into account. If new issues are found with fuzzing, the interface should be a candidate for increased fuzzing efforts. Even limited dumb fuzzing with manual instrumentation can find some low-hanging fruit, and thus should be done for all inputs of the system. The quality of third party components should also be considered.
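
As an illustration of such limited dumb fuzzing, the sketch below randomly mutates a valid sample input, feeds it to a parser and reports only unexpected exceptions; the target function and mutation budget are illustrative assumptions.

```python
# Minimal "dumb" mutation fuzzer: flip random bytes in a valid sample
# and report only unexpected crashes. Illustrative sketch only.
import json
import random

def mutate(sample: bytes) -> bytes:
    data = bytearray(sample)
    for _ in range(random.randint(1, 8)):        # a few random byte flips
        data[random.randrange(len(data))] = random.randrange(256)
    return bytes(data)

def fuzz(target, sample: bytes, iterations=10000, expected=(ValueError,)):
    findings = []
    for i in range(iterations):
        case = mutate(sample)
        try:
            target(case)
        except expected:
            pass                        # a clean parse error is fine
        except Exception as exc:        # anything else is a finding
            findings.append((i, type(exc).__name__, case))
    return findings

# Example target: a JSON parser. JSONDecodeError subclasses ValueError,
# so only genuinely unexpected exceptions get reported.
print(fuzz(lambda d: json.loads(d.decode('utf-8', 'replace')),
           b'{"a": [1, 2, 3]}'))
```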

To stay in use, fuzzing needs to be automated. Even then, the infrastructure will eventually stop finding new bugs. Up to a point, improving test case generation (better models or samples, adding new fuzzers) and instrumentation (use of new type of memory debugger, custom test harness) will continue to find new bugs. Many of these activities only require a small one-time cost, and are easily justified. Others require more significant efforts, including the use of experts.

Coverage guided fuzzing is a promising field, and can improve the quality of fuzz test campaigns. In addition, coverage provides insight into whether placing further effort into improving the test campaign would be useful.
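
For Python targets, for example, coverage-guided fuzzing is available off the shelf. The sketch below uses Google's open-source Atheris fuzzer with an assumed JSON-parsing target; Atheris feeds coverage information back into test case generation in the way described above.

```python
# Coverage-guided fuzzing of a Python target with Atheris
# (https://github.com/google/atheris). Illustrative sketch only.
import sys
import atheris

with atheris.instrument_imports():      # collect coverage for imports
    import json                         # the (assumed) code under test

def test_one_input(data: bytes):
    try:
        json.loads(data.decode('utf-8', 'replace'))
    except ValueError:
        pass                            # expected parse failure

atheris.Setup(sys.argv, test_one_input)
atheris.Fuzz()
```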

A key question is the availability of the skills required to set up an effective fuzz test campaign. The Software Security Group is pivotal, but eventually testing will have to be done by developers with the proper tools and skills. Setting up ways of working that make fuzzing more effective is one way of doing this. This could mean, for example, standardized ways of producing instrumented builds and ways of leveraging results from other activities, such as automatically reusing API documentation as a model for fuzz tests.

The workflow used in fuzzing is somewhat different from traditional testing, and this needs to be accounted for. Instead of being an activity that responds to the previous development increment, automated fuzzing is a background process that needs maintenance. Currently this requires some expert knowledge, but better tools could also fill this gap.

Coverage-based methods are an efficient way of improving the test case corpus, and thus test quality. Ideally, they would also make it possible to automatically direct fuzzing efforts into areas of the code that have recently changed. This would also serve as evidence that the new functionality has been tested.

Pietikäinen, P., Kettunen, A., & Röning, J. (2016). Steps Towards Fuzz Testing in Agile Test Automation. International Journal of Secure Software Engineering, 7(1), 38–52.

  • Product Security Incident Response Team

Product Security Incident Response Team describes how the vulnerabilities of a product's software can be easily monitored. All organizations are connected to the outside world through computer networks, and companies and societies depend on working communication networks. The security of communication is more important today than ever before. Several means are applied to improve security: computers have anti-virus and malware detection software; firewalls protect companies' networks; all connections are passed through proxies, which makes it easier to enforce organization-level policies; and network traffic can be monitored with intrusion detection systems that alert if suspicious traffic is detected. There are also administrative tools that help information management personnel check the status of computer software versions and automatically install security patches.

A new and even bigger challenge arises when different control systems, including critical infrastructure and autonomous machines, are connected to large-scale networks that can be accessible from the public Internet and are therefore vulnerable to the same kinds of threats as computer networks. An industrial site can have hundreds or even thousands of different types of devices connected to the industrial network. Although accessing these devices is much harder than accessing computers in an office network, it is still possible. The same kinds of administrative tasks are much harder to execute in an industrial network than in a typical computer network consisting of, for example, Windows machines and Linux servers. It is therefore crucial that all vendors follow good practices for building secure devices from the beginning.

The Cyber Trust program introduced one approach for monitoring the security of a product's software during the product's life cycle. A Product Security Incident Response Team (PSIRT) is an organization whose responsibility is to proactively scan for new vulnerabilities related to the product software and to react if any are found. Vulnerability scanning is an important part of PSIRT operation. A concrete example is presented of how vulnerability scanning can be organized.

There is not only one way to organize PSIRT activity; it depends on several issues. One thing is, however, common to all solutions: the organization has to define what the PSIRT does. PSIRT operation costs may come from new tools and equipment and from personnel costs. No matter how small the operation is, someone still needs to do something, and that requires money. The overall cost may be lower than with the current way of operating, but organizations usually see any new function as an additional cost. There must therefore be a clear mission statement of what the new function – the PSIRT – does and delivers.

Checking software for vulnerabilities is an important part of software development and maintenance, and a fundamental part of PSIRT operation. Vulnerability scanning is not a one-shot activity; it has to be done on a regular basis, because new vulnerabilities may be found at any time. In the development phase it is important to check that a new software component that is planned to be used does not contain vulnerabilities that prevent its use in the product. In the case of an open source component, the initial vulnerability analysis may be done as part of open source governance, when the open source licenses are also checked. After that, the software components should be checked constantly.

For vulnerability scanning, access to a vulnerability database is mandatory. There are several public Common Vulnerabilities and Exposures (CVE) databases, such as http://cve.mitre.org, https://web.nvd.nist.gov/view/vuln/search, https://www.exploit-db.com/ and https://www.circl.lu/services/cve-search/. There are also product-specific databases, such as Wind River's https://www.windriver.com/security/cve/main.php, and non-public commercial databases, such as Risk Based Security's https://www.riskbasedsecurity.com/vulndb/.

Because security is everyone's concern, the basic CVE databases are public and common to all database providers. However, there are still differences in how the data is presented. For example, Mitre and the NVD have a very close relationship: Mitre feeds the CVE list to the U.S. National Vulnerability Database (NVD), which then builds upon the information included in CVE entries to provide enhanced information for each CVE identifier, such as fix information, severity scores and impact ratings. The NVD also provides advanced search features, such as by individual CVE ID; by OS; by vendor name, product name and/or version number; and by vulnerability type, severity, related exploit range and impact.
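
As a sketch of what automated scanning against such a database can look like, the snippet below queries the NVD REST API for CVEs matching a component keyword. The endpoint is NVD's current public interface (newer than the tooling of the program period), and the plain keyword search, rather than precise CPE-based matching, is an illustrative simplification.

```python
# Query the NVD REST API for CVEs mentioning a component.
# Keyword search is an illustrative simplification; a production
# scanner would match precise CPE names and versions instead.
import requests

NVD_API = 'https://services.nvd.nist.gov/rest/json/cves/2.0'

def cves_for(component, limit=20):
    resp = requests.get(NVD_API,
                        params={'keywordSearch': component,
                                'resultsPerPage': limit},
                        timeout=30)
    resp.raise_for_status()
    for item in resp.json().get('vulnerabilities', []):
        cve = item['cve']
        summary = next((d['value'] for d in cve['descriptions']
                        if d['lang'] == 'en'), '')
        yield cve['id'], summary[:80]

for cve_id, summary in cves_for('openssl'):
    print(cve_id, summary)
```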

Although public CVE databases can be accessed directly, doing so might expose critical information if the access is used to query information related to the product software. If the queries are monitored, a listener can learn about all the software components related to the product, and thus about possible unpatched vulnerabilities that could be exploited in targeted attacks. A better approach is to set up and maintain a local CVE database from which the product-specific CVE searches are done.

Product security is not something that is built into the product during development and then forgotten. Especially today, when software is built from hundreds of open source components, it is impossible to verify all the components thoroughly. Vulnerabilities should therefore be monitored constantly; that is an important function of the Product Security Incident Response Team (PSIRT). There are no strict guidelines for how an organization should organize its PSIRT; that depends on the number of products, company size and many other things. A solution for vulnerability scanning was introduced: the presented framework was developed in the Cyber Trust program and has been successfully used at Bittium. However, not all open source projects formally report CVEs for the security flaws found in the project. Many security errors are corrected, and the only trace of the fix can be found in the changelog or in the version control system's commit comments.

  • Mobile Device Management

The digitalization and cyber security trends will continue and also form the basis for device management needs and evolution. Everything will be connected, all data will be analyzed, and growing cyber security awareness brings market growth possibilities for device management products. Product life cycles and windows of opportunity for growth are becoming ever shorter, and all markets are global.

With the emergence of the IoT, one important issue to consider is who will take responsibility for the IoT devices. When a company uses the IoT as part of its process (e.g. a factory), the answer is quite obvious: the IT department, which also manages all other devices, takes responsibility for the IoT as well. Where the IoT enables new business via digitalization, the business unit could take responsibility directly; usually this is done through a separate innovation department in the research and development organization or in the Operational Technology (OT) organization.

In the past, Information Technology (IT) and Operational Technology (OT) were seen as two distinct domains of a business. The former focused on all the technologies necessary for managing the processing of information, whereas the latter supported the devices, sensors and software necessary for physical value creation and manufacturing processes. One of the factors reshaping the IoT market is the convergence of IT and OT, which is basically a must in order to scale to the device volumes of the IoT vision and to keep security at a proper level.

The convergence of industrial (OT) and enterprise (IT) networks is enabling applications such as video surveillance, smart meters, asset/package tracking, fleet management, digital health monitors and a host of other next-generation connected services.

The main items in future device management systems are related to scaling, customization, flexibility and the integrations of the device management system. The main need and growth in the device management business will be in the IoT, due to the automation and cyber security needs in that field.

Cyber security is one of the main concerns and a major roadblock for the vast expansion in the number of connected devices. Security and the evolution of the IoT will require standardization, alliances, good co-operation capabilities and close integration between different technology providers.

The evolution of enterprise device management will divide into the Enterprise Mobility Management (EMM) needs of large enterprises and the basic MDM needs of small and medium enterprises. Organizations' needs also divide into basic security level needs and advanced security needs, the latter found for example in governmental organizations.

Contributors to this research theme: Bittium, Nokia, Ericsson, Elisa, Capricode, VTT, University of Turku, University of Oulu