On April 13, 2020, the journey toward the new AWS Data Analytics Specialty certification officially began, following the beta phase of December 2019 / January 2020. That beta coincided in time with the AWS Database Specialty beta, which forced me to choose between them; in the end, I opted for the Database Specialty, as I had recently sat the AWS Big Data exam.
The “beta exam” experience is very different from the “standard” one: 85 questions in 4 hours – that is, 20 more questions and one more hour – a really intense experience. I recommend taking a 5-minute break – test centres allow them – since, after the third hour, it is challenging to stay focused.
The certification is the new version of the AWS Big Data Specialty, an exam that will be withdrawn in June 2020. I will not go into the differences in much depth; suffice it to say that the Machine Learning domain has been removed, while the remaining domains have been substantially expanded and updated. But beware: Machine Learning and IoT still appear integrated into the other domains, so you need to know them at least at an architectural level.
Prerequisites and recommendations
I will not repeat the information already available on the AWS website; instead, I will give my personal recommendations and observations, as I consider the Learning Path that AWS suggests to be somewhat light for the current level of the exam.
- AWS experience at the architectural level. The exam is largely focused on advanced solution architecture – the five pillars – and, to a lesser extent, on development, present mainly in services such as Kinesis and Glue. I recommend holding the AWS Solutions Architect Professional certification, or the AWS Solutions Architect Associate plus the AWS Security Specialty.
- Advanced AWS security experience. Security is a complete domain of the exam, but it also appears – cross-domain – in many questions. If you hold the AWS Solutions Architect Professional, general security knowledge may be sufficient – you do not need certification-level depth for each service. Otherwise, the AWS Security Specialty is a good option, or equivalent knowledge of certain services, which I will point out later on.
- Analytics knowledge. If you lack it, I’d recommend studying books such as “Data Analytics with Hadoop” (O’Reilly, 2016) or taking the courses indicated in the AWS Learning Path. Likewise, carry out labs or pet projects to gain some practical experience.
- Knowledge of the Hadoop ecosystem. Related to the previous point. High-level, architectural knowledge of the ecosystem is a must: Hive, Presto, Pig, …
- Knowledge of Machine Learning and IoT in the AWS ecosystem: SageMaker and the core IoT services at the architectural level.
The questions follow the style of other certifications such as the Solutions Architect Professional, Security Specialty, or Database Specialty. Most are scenario-based, long, and complex. You are not going to find many simple questions: perhaps between 5% and 10% were “easy”, but even those came in scenario format.
Let’s have a look at an example taken from the AWS sample questions:
I’d classify this question as “intermediate” in difficulty. If you have taken the Architect Professional or a specialty such as Security or Big Data, you will know what I am talking about. The questions are certainly much harder and deeper than in the previous version of the exam.
I’d recommend taking the new specialty directly, as the old one contains questions about already deprecated services – or outdated information.