A startup is running a pilot deployment of around one hundred sensors to measure street noise and air quality in urban areas for three months. It was noted that around 4 GB of device data is generated each month. The company uses a load-balanced, auto-scaled layer of EC2 instances and an RDS database with 500 GB of standard storage. The pilot was a success, and now they want to deploy at least 100,000 sensors, which need to be supported by the backend. The user wishes to store the data for a minimum of two years in order to analyze it. What should the user do?
Answer Posted / Rohit Gautam
To handle the increased data storage requirements, the user should consider Amazon S3 as a cost-effective and scalable solution for long-term data storage. Extrapolating from the pilot, 100,000 sensors would generate roughly 4 TB per month, or about 96 TB over two years, far beyond the 500 GB RDS instance, so the raw data belongs in S3 rather than the relational database. They can also use AWS Glue for data integration and processing, and Amazon Redshift for data warehousing to analyze large amounts of data efficiently.
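The sizing argument above can be checked with a back-of-the-envelope calculation. This is a rough sketch that extrapolates linearly from the pilot figures given in the question (100 sensors, ~4 GB/month); the constant names are illustrative, not from any AWS API:

```python
# Storage projection for scaling the sensor fleet, using figures from the question.
PILOT_SENSORS = 100
PILOT_GB_PER_MONTH = 4
PROD_SENSORS = 100_000
RETENTION_MONTHS = 24          # two years
RDS_CAPACITY_GB = 500

# Assume each sensor produces data at the same rate as in the pilot.
gb_per_sensor_month = PILOT_GB_PER_MONTH / PILOT_SENSORS   # 0.04 GB/sensor/month
monthly_gb = PROD_SENSORS * gb_per_sensor_month            # ~4,000 GB (~4 TB)
total_gb = monthly_gb * RETENTION_MONTHS                   # ~96,000 GB (~96 TB)

print(f"Monthly volume: {monthly_gb:,.0f} GB")
print(f"Two-year total: {total_gb:,.0f} GB (~{total_gb / 1000:.0f} TB)")
print(f"RDS capacity:   {RDS_CAPACITY_GB} GB -> exceeded within the first month")
```

The two-year total of roughly 96 TB is nearly 200 times the RDS instance's 500 GB, which is why offloading the raw sensor data to S3 is the sensible first step.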