For artificial intelligence and machine learning to work properly, they need data, and plenty of it. Fortunately, that's exactly what the digitized world is pumping out: a record 64.2 zettabytes of new data was created in 2020 alone. Demand for storage capacity continues to grow, but not any old kind of storage will do. Because this data is so valuable, it must be stored in ways that are safe, accessible, and scalable. As more organizations run AI and ML workloads, they need a new approach to storage that can keep up.

A real-world use case is a good way to illustrate the point. Think about the travel industry and all the data generated by airlines, car rental companies, hotels and so on. Now imagine you own a travel agency. What if you knew, for instance, where most airline passengers were headed on a particular weekend? That would be a huge competitive advantage. But arriving at useful conclusions like this is easier said than done. The travel industry generates about a petabyte of data every day, and some of that data gets duplicated by aggregator sites. Data of this nature is also time-sensitive, so you need a way to quickly determine which data is meaningful, and that requires the ability to scale.

Fast object storage helps overcome the challenge of retaining big data at this scale, so organizations can extract the value locked up in that data and move their businesses forward.

Want to learn more about how object storage can help enable real-world AI deployments at scale and deliver real business value? Read my recent piece for Unite.AI.
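Before you go, here is a minimal sketch of what that first-pass filtering step in the travel example might look like in practice. It assumes an S3-compatible object store; the endpoint, bucket name, and date prefix are all hypothetical, and deduplicating by ETag is just one illustrative shortcut (ETags are content hashes only for non-multipart uploads), not a prescription:

```python
import boto3

# Hypothetical S3-compatible endpoint; any fast object store that speaks
# the S3 API could be addressed the same way.
s3 = boto3.client("s3", endpoint_url="https://objects.example.com")

def fresh_unique_keys(bucket: str, date_prefix: str) -> list[str]:
    """List objects under a date prefix, skipping byte-identical duplicates.

    Aggregator feeds often re-publish the same payload under new keys, so
    comparing ETags is a cheap first pass before any expensive analysis runs.
    """
    seen_etags: set[str] = set()
    keys: list[str] = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=date_prefix):
        for obj in page.get("Contents", []):
            if obj["ETag"] not in seen_etags:
                seen_etags.add(obj["ETag"])
                keys.append(obj["Key"])
    return keys

# Because the data is time-sensitive, look only at today's feed
# (hypothetical bucket and prefix for illustration).
keys = fresh_unique_keys("travel-feeds", "bookings/2021-06-05/")
```

The key design point is that the object store's flat namespace and prefix-based listing let this filtering step scale horizontally: each day, carrier, or region can be scanned independently, without walking a file-system hierarchy.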