The Cloud and Security Issues

What is the cloud?
The availability of inexpensive processors, memory, storage, and bandwidth has given rise to a new breed of Internet hosting services built on a scalable, fine-grained, pay-per-use model. Cloud computing replaces the traditional model of owning and operating IT infrastructure. Under the cloud model, BioSense 2.0 runs on its own dedicated set of servers hosted by a shared hosting provider that delivers scalable, pay-per-use services at low cost. With this service model, we avoid the expense of purchasing computer and network hardware, which must be physically housed, secured, powered, and cooled, all while rapidly becoming obsolete.

 

What happens if the cloud fails? Is there a back-up? Is there a plan for system access and disaster recovery if BioSense 2.0 goes down?
The cloud is designed with availability and reliability in mind from the ground up. The BioSense 2.0 systems (hosted in GovCloud) are continually monitored and backed up on a nightly schedule. If a system fails, we can typically restore from backup within minutes. However, some small amount of downtime is unavoidable, both for routine system maintenance and for the unforeseen issues that inevitably arise with any IT system and any service provider. AWS availability for the project at this point is 99.9% uptime, which is roughly equivalent to one full day of downtime over a given three-year period. While 100% availability is always the goal, it is important to recognize that greater availability comes with higher costs and greater maintenance investment. The goal is to meet the needs of the BioSense 2.0 community in terms of both availability and cost.
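For readers who want to see where these figures come from, the short sketch below converts an availability percentage into the downtime it allows. It is purely illustrative arithmetic using the percentages quoted in this FAQ, not part of the BioSense 2.0 system itself.

```python
# Rough downtime arithmetic behind the availability figures quoted above.
# Illustrative sketch only; not part of BioSense 2.0.

HOURS_PER_YEAR = 24 * 365.25

def downtime_hours(availability_pct: float, years: float = 1.0) -> float:
    """Maximum downtime (in hours) implied by an availability percentage."""
    return (1.0 - availability_pct / 100.0) * HOURS_PER_YEAR * years

# 99.9% uptime over three years allows roughly 26 hours, i.e. about one full day.
print(round(downtime_hours(99.9, years=3), 1))   # ~26.3
# Amazon's advertised 99.95% annual uptime allows roughly 4.4 hours per year.
print(round(downtime_hours(99.95), 1))           # ~4.4
```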

 

I don’t know much about cloud computing, and will need to answer questions from policymakers about data security, cost, etc. Where can I find more information about it that non-informaticians will understand? 
Cloud computing replaces the traditional model of owning and operating IT infrastructure. Under the cloud model, BioSense 2.0 will have its own dedicated set of servers hosted by a shared hosting provider that delivers scalable, pay-per-use services at low cost. By running these operations through a cloud provider, we share costs with thousands of other customers, take advantage of economies of scale, and save money. Although private and confidential data are housed by the provider, professional cloud providers take the protection of their customers’ data very seriously. With proper security practices, cloud hosting is just as secure as, and can be more secure than, traditional self-run IT operations. To ensure security, ASTHO screened Amazon Web Services against strict security criteria that address the concerns of federal, state, and local stakeholders. Further, BioSense 2.0 meets all federal security criteria, including a CDC Certification and Accreditation (C&A) at the Federal Information Security Management Act (FISMA) moderate level. BioSense 2.0 also follows National Institute of Standards and Technology (NIST) standards for computer security (NIST SP 800-18, SP 800-37, and SP 800-53).

How will we know our data are secure and private? Where is our data going?
Your data, along with all data from the other BioSense 2.0 participants, will be housed on servers in a secure data center facility operated by Amazon Web Services. Maintaining the security and privacy of these data is a central component of the BioSense 2.0 IT strategy. We are employing industry-standard, state-of-the-art technologies and practices to ensure that only authorized users can access the system. 

The BioSense 2.0 partitioned storage architecture makes use of Amazon’s native infrastructure protections. Amazon’s distributed storage service (S3) provides robust countermeasures against data loss. Objects stored in a Region never leave that Region unless you explicitly transfer them out; for example, objects stored in GovCloud never leave GovCloud. Authentication mechanisms ensure that data is kept secure from unauthorized access.
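As a purely illustrative sketch (not the actual BioSense 2.0 configuration), the example below uses the standard AWS boto3 SDK to show how storage can be pinned to the GovCloud Region, encrypted by default, and blocked from public access. The bucket name is hypothetical.

```python
# Illustrative sketch only: this is NOT the actual BioSense 2.0 configuration.
# It shows how an S3 bucket can be created in the AWS GovCloud Region, given
# default server-side encryption, and blocked from any public access.
import boto3

REGION = "us-gov-west-1"                      # AWS GovCloud (US) Region
BUCKET = "example-surveillance-data-bucket"   # hypothetical bucket name

s3 = boto3.client("s3", region_name=REGION)

# Create the bucket in GovCloud; objects written here stay in this Region
# unless they are explicitly copied or transferred out.
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# Require server-side encryption (AES-256) for all objects by default.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
)

# Block all forms of public access to the bucket.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```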

We are also implementing access control tools for data providers, so that you can specify who within the community can access your data, and under what conditions. Lastly, as the system evolves, we will periodically engage outside security consultants to perform security audits and testing.


What happens if the BioSense system goes down? 
As part of the BioSense 2.0 cloud deployment, we are working to create a highly reliable system. Failures of individual system components are inevitable over time, but we have designed the architecture with redundancies and backups in place. When something goes wrong, we plan either to restore service rapidly or, where possible, to adapt automatically and sustain normal service without any interruption. As part of the process of bringing individual data providers on board, we will discuss different failure scenarios and put measures in place to adapt accordingly.

As with any IT system, we expect some amount of downtime or service interruption, both planned and unplanned. Amazon advertises an Annual Uptime Percentage of at least 99.95%. The Amazon S3 storage infrastructure keeps multiple copies of data to ensure it can be recovered if necessary. In the event of a public health emergency, when the system is likely to see a spike in utilization and any downtime is especially undesirable, cloud-based hosting lets us allocate additional resources almost instantly to increase both performance and redundancy.
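To make the "allocate more resources" point concrete, here is a minimal, purely hypothetical sketch (the Auto Scaling group name is invented, and this is not the actual BioSense 2.0 tooling) of how additional server capacity can be requested through the standard AWS boto3 SDK:

```python
# Illustrative sketch only: not the actual BioSense 2.0 deployment scripts.
# It shows how extra capacity can be added on short notice, for example
# during a surge in surveillance activity.
import boto3

autoscaling = boto3.client("autoscaling", region_name="us-gov-west-1")

# "biosense-app-servers" is a hypothetical Auto Scaling group name.
autoscaling.update_auto_scaling_group(
    AutoScalingGroupName="biosense-app-servers",
    MinSize=4,
    MaxSize=16,
    DesiredCapacity=8,   # raise the baseline to surge capacity
)
```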

 

Does anyone have master access to see all data in the system? 
The BioSense 2.0 systems administrators, working on behalf of ASTHO and the BioSense 2.0 governance body, will require master access to all data in the system to perform maintenance and troubleshooting and to support all BioSense 2.0 users. However, such access will be limited to a small group of users experienced in managing personal health information.

What will be the process to grant or revoke access to my data? 
Each jurisdiction will designate an administrator once a data use agreement (DUA) is signed. That administrator authorizes who may access the jurisdiction's data, and at what level of detail, using a secure Web dashboard for specifying access control settings.
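As a loose illustration of what such an access specification could look like behind the dashboard, here is a hypothetical sketch; the field names, roles, and data model are invented for illustration and are not the real BioSense 2.0 design.

```python
# Hypothetical sketch of the kind of record an access-control dashboard might
# maintain; names and fields are invented, not the BioSense 2.0 data model.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class AccessGrant:
    user: str                     # account being granted access
    jurisdiction: str             # data owner making the grant
    data_level: str               # e.g. "aggregate" or "line-level"
    granted_at: datetime = field(default_factory=datetime.utcnow)
    revoked_at: Optional[datetime] = None

    def revoke(self) -> None:
        """Mark the grant as revoked; the user's access ends immediately."""
        self.revoked_at = datetime.utcnow()

    @property
    def active(self) -> bool:
        return self.revoked_at is None

# Example: a jurisdiction administrator grants, then later revokes, access.
grant = AccessGrant(user="epi.analyst@example.gov",
                    jurisdiction="Example County", data_level="aggregate")
grant.revoke()
print(grant.active)   # False
```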