A startup is running a pilot deployment of around one hundred sensors to measure street noise and air quality in urban areas for three months. It was noted that each month around 4 GB of device data is generated. The company uses a load-balanced, auto-scaled layer of EC2 instances and an RDS database with 500 GB of standard storage. The pilot was a success, and now they need to deploy at least 100,000 sensors, which need to be supported by the backend. The user wishes to store the data for a minimum of two years to analyze it. What should the user do?
Answer / Rohit Gautam
At this scale, the RDS instance's 500 GB of standard storage is nowhere near sufficient: 100,000 sensors generating data at the pilot's rate produce roughly 4 TB per month, or about 96 TB over two years. The user should write the raw sensor data to Amazon S3, which is a cost-effective and virtually unlimited store for long-term retention (lifecycle rules can also transition older objects to colder storage classes to reduce cost). They can then use AWS Glue for data integration and processing, and Amazon Redshift for data warehousing to analyze large amounts of data efficiently.
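As a rough sketch of why the 500 GB RDS volume cannot hold the scaled-up workload, the pilot numbers can be extrapolated linearly (an assumption; actual per-sensor rates may vary):

```python
# Back-of-the-envelope storage estimate for the scaled-up deployment.
# Assumption: data volume scales linearly with sensor count, using the
# pilot figures from the question (100 sensors -> ~4 GB/month).

PILOT_SENSORS = 100
PILOT_GB_PER_MONTH = 4
TARGET_SENSORS = 100_000
RETENTION_MONTHS = 24  # two years of retention

gb_per_sensor_month = PILOT_GB_PER_MONTH / PILOT_SENSORS   # 0.04 GB per sensor
monthly_gb = gb_per_sensor_month * TARGET_SENSORS          # total GB ingested per month
total_tb = monthly_gb * RETENTION_MONTHS / 1000            # TB retained over two years

print(f"Monthly ingest: {monthly_gb:,.0f} GB (~{monthly_gb / 1000:.1f} TB)")
print(f"Two-year retention: ~{total_tb:.0f} TB")
```

The result, roughly 4 TB per month and ~96 TB over two years, is orders of magnitude beyond the 500 GB RDS volume, which is why an object store like S3 is the natural landing zone for the raw data.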
Which component services are used with Amazon SimpleDB?
What is EC2 in AWS?
What's the difference between scalability and elasticity?
How do you monitor network traffic?
What are the different categories of AMIs?
What are policies, and what are the different types of policies?
How many EC2 instances can you use in a VPC?
How will a user make a copy of an instance?
Explain how a buffer is used in Amazon Web Services.
If I launch a standby RDS instance, will it be in the same Availability Zone as my primary?
What are Standard Instances?
What are Availability Zones?