Amazon CEO Jeff Bezos has issued his annual letter to shareholders, in which he noted that third-party sellers' share of sales on Amazon grew sharply, from 3% in 1999 to 58% in 2018. Over the same period, Amazon's own first-party sales grew from $1.6 billion in 1999 to $117 billion last year. He also briefly touched on AWS and deep learning.
Amazon champions a culture of builders: people who are curious and inventive, and who like to explore. Bezos said that a builder's mindset helps the company approach big, hard-to-solve opportunities with a humble conviction that success can come through iteration: invent, launch, reinvent, relaunch, start over, rinse, repeat, again and again. Builders know that the path to success is anything but straight.
AWS is a typical product of this culture. The letter notes that AWS's millions of customers span startups, large enterprises, government agencies, and nonprofits, each looking to build better solutions for their end users. We spend a lot of time thinking about what those organizations want, and what the people inside them want: developers, development managers, operations managers, chief information officers, chief digital officers, chief information security officers, and more.
Much of what we build at AWS is based on listening to customers. It is critical to ask customers what they want, listen carefully to their answers, and make a plan to deliver it quickly and thoughtfully (speed matters in business!). No business can thrive without this customer obsession. But it is not enough. The biggest needle movers will be things customers don't know to ask for; we must invent on their behalf, tapping into our inner imagination of what is possible.
AWS itself, as a whole, is an example. No one asked for AWS. No one. It turned out the world was ready and waiting for an offering like AWS but didn't know it yet. We had a hunch, followed our curiosity, took the necessary financial risks, and began building, reworking, experimenting, and iterating as many times as it took.
Within AWS, the same pattern has repeated many times. For example, we invented DynamoDB, a highly scalable, low-latency key-value database now used by thousands of AWS customers. And while listening closely to customers, we heard loud and clear that companies felt constrained by their commercial database options and had been unhappy with their database providers for decades: these offerings are expensive, proprietary, have high lock-in, and come with punitive licensing terms. We spent several years building our own database engine, Amazon Aurora, a fully managed MySQL- and PostgreSQL-compatible service with the same or better durability and availability as the commercial engines, at one-tenth the cost. We were not surprised when this work succeeded.
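To make the key-value model concrete, here is a minimal, purely illustrative sketch of the put-item/get-item access pattern a store like DynamoDB exposes. The class and method names (`ToyKeyValueTable`, `put_item`, `get_item`) are hypothetical stand-ins, not the real AWS SDK API; the point is that every item is addressed by a partition key, making lookups constant-time regardless of table size.

```python
# Illustrative only: a toy in-process key-value table mimicking the
# access pattern of a managed store like DynamoDB. Hypothetical names,
# not the AWS SDK.

class ToyKeyValueTable:
    """Items are addressed by a single partition key; lookups are O(1)."""

    def __init__(self, key_name):
        self.key_name = key_name
        self._items = {}

    def put_item(self, item):
        # Every item must carry the partition key attribute.
        key = item[self.key_name]
        self._items[key] = dict(item)

    def get_item(self, key):
        # Return a copy so callers cannot mutate stored state.
        item = self._items.get(key)
        return dict(item) if item is not None else None

orders = ToyKeyValueTable("order_id")
orders.put_item({"order_id": "o-123", "status": "shipped"})
print(orders.get_item("o-123")["status"])  # prints "shipped"
```

The trade-off relative to a relational database is deliberate: there are no joins and no ad-hoc queries, but access by key stays fast and predictable at any scale.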
But we are also optimistic about purpose-built databases for specialized workloads. For the past 20 to 30 years, companies ran most of their workloads on relational databases. Developers' broad familiarity with relational databases made the technology the default choice even when it was not ideal. Though suboptimal, dataset sizes were often small enough, and acceptable query latencies long enough, that you could make it work. But today many applications store vast amounts of data, terabytes and petabytes, and the requirements have changed. Modern applications demand low latency, real-time processing, and the ability to handle millions of requests per second. Serving them takes not just a key-value store like DynamoDB but also an in-memory database like Amazon ElastiCache.
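The in-memory case can be sketched with the classic read-through caching pattern that a service like ElastiCache typically serves: check a fast in-memory store first, and only fall back to the slow backing database on a miss. This is a self-contained, illustrative sketch; `TTLCache` and `read_through` are hypothetical names, not an AWS API.

```python
import time

# Illustrative only: a minimal in-process cache with expiry, sketching
# the read-through pattern an in-memory store like ElastiCache serves.
# All names here are hypothetical, not an AWS API.

class TTLCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expires_at)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazily evict expired entries
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

def read_through(cache, key, load_fn):
    """Serve from cache; on a miss, load from the slow store and cache it."""
    value = cache.get(key)
    if value is None:
        value = load_fn(key)
        cache.set(key, value)
    return value

cache = TTLCache(ttl_seconds=60)
# The lambda stands in for a query against a slow backing database.
profile = read_through(cache, "user:42", lambda k: {"name": "Ada"})
print(profile["name"])  # prints "Ada"
```

The TTL is the key design choice: cached data is allowed to go slightly stale in exchange for microsecond reads, which is exactly the trade-off low-latency, high-request-rate applications are making.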
We are also actively helping companies adopt machine learning. We have worked in this area for a long time, and, as with other important advances, our initial attempts to externalize some of our early internal machine learning tools failed. It took years of wandering (experimentation, iteration, and refinement, plus valuable insight from customers) to arrive at SageMaker, which launched just 18 months ago.
SageMaker removes the tedium, complexity, and guesswork from every step of the machine learning process, helping democratize AI. Today, thousands of customers are using SageMaker to build machine learning models on top of AWS. We continue to strengthen the service, including adding new reinforcement learning capabilities. Reinforcement learning has a steep learning curve and many moving parts, which until now has put it out of reach of all but the most well-funded and technically sophisticated organizations. None of this would be possible without a culture of curiosity and a willingness to experiment on customers' behalf. Customers have responded to this customer obsession and listening: AWS is now a $30 billion annualized business and is growing rapidly.
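To show what the reinforcement learning loop mentioned above actually looks like, here is a tiny from-scratch sketch of tabular Q-learning on a five-state corridor, where the agent learns by trial and error to walk right toward a reward. This is purely illustrative and is not the SageMaker RL API; all names and hyperparameters are assumptions chosen for the example.

```python
import random

# Illustrative only: tabular Q-learning on a tiny 5-state corridor,
# showing the trial-and-error loop at the heart of reinforcement
# learning. A from-scratch sketch, not the SageMaker RL API.

N_STATES = 5          # states 0..4; reaching state 4 yields reward 1
ACTIONS = [-1, +1]    # move left or move right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

random.seed(0)
q = [[0.0, 0.0] for _ in range(N_STATES)]  # q[state][action_index]

def step(state, action):
    """Deterministic transition; the episode ends at the right end."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    done = nxt == N_STATES - 1
    reward = 1.0 if done else 0.0
    return nxt, reward, done

for _ in range(500):                       # training episodes
    state, done = 0, False
    while not done:
        # Epsilon-greedy: mostly exploit, occasionally explore.
        if random.random() < EPSILON:
            a = random.randrange(2)
        else:
            best = max(q[state])
            a = random.choice([i for i in (0, 1) if q[state][i] == best])
        nxt, reward, done = step(state, ACTIONS[a])
        # Q-learning update: nudge toward reward + discounted future value.
        target = reward + (0.0 if done else GAMMA * max(q[nxt]))
        q[state][a] += ALPHA * (target - q[state][a])
        state = nxt

# After training, the greedy policy moves right from every state.
policy = ["right" if q[s][1] > q[s][0] else "left" for s in range(N_STATES - 1)]
print(policy)
```

Even in this toy, the "many moving parts" are visible: an environment, an exploration schedule, a learning rate, and a discount factor all have to be tuned together, which is the kind of heavy lifting a managed service aims to absorb.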