How our Azure Data Architect Unified & Automated Complex Data for a Real Estate Platform
The Client & Our Involvement
The client is the Head of Data Engineering for a global Commercial Real Estate company.
They were building a platform where brokers and agents could log in and see detailed, business-critical industry data, such as price listings per area and operating costs, on a live map.
The data supporting this platform resided on-premises, spread across different servers and stored in different formats. It needed to be translated into a consistent, cloud-based architecture, which meant building several data pipelines to collect the raw data into a unified set.
Streamlining, cleaning, and formatting such a high-volume, complex data flow required an expert Azure Data Architect to develop and automate the pipelines using various technologies.
We were able to find the right candidate for the company, and the project was tackled and finished on time.
The Challenge: Data Integration – Handling, Formatting & Automating Data
The main challenge was to adapt diverse business-related data from many internal and external sources into a custom data-warehouse-style environment. The data was siloed and in incompatible formats. Pipelines needed to be built to ingest the data and run processes that turned raw data into usable, standardized data. Once processed, this curated data could be used by this platform and others. Furthermore, anyone in the company could tap into that data and create their own reports.
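To make the ingest-and-standardize pattern concrete, here is a minimal sketch in Python. The schema, field names, and unit conversions are illustrative assumptions only, not the client's actual pipeline (which was orchestrated with Azure Data Factory); the point is that each incompatible source gets its own adapter that maps raw rows into one curated, unified schema.

```python
from dataclasses import dataclass

# Hypothetical standardized record; field names are illustrative,
# not the client's real schema.
@dataclass
class Listing:
    area: str
    price_per_sqft: float
    operating_cost: float

def from_source_a(row: dict) -> Listing:
    # Assumed source A reports a price per square foot directly.
    return Listing(
        area=row["region"].strip().title(),
        price_per_sqft=float(row["price_sqft"]),
        operating_cost=float(row["opex"]),
    )

def from_source_b(row: dict) -> Listing:
    # Assumed source B reports a total price and a size,
    # so the per-square-foot rate is derived here.
    return Listing(
        area=row["AREA_NAME"].strip().title(),
        price_per_sqft=float(row["total_price"]) / float(row["sqft"]),
        operating_cost=float(row["operating_costs"]),
    )

def ingest(rows_a: list[dict], rows_b: list[dict]) -> list[Listing]:
    # One adapter per source; the output is a single unified set
    # that downstream platforms and reports can consume.
    return [from_source_a(r) for r in rows_a] + [from_source_b(r) for r in rows_b]
```

In a production pipeline this adapter layer would run as an automated, scheduled activity per source, with the curated output landing in cloud storage for the platform and ad-hoc reporting to consume.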
This complex project required a Senior Azure Data Architect who could act as both a data architect and a data engineer. The architect would need extensive experience to anticipate problems in the cloud that would not have occurred on-prem. They would also have to decide how to proceed when Azure Data Factory didn't support a needed customization, or when working with larger volumes of data.
As an added difficulty, Azure Data Architects and Engineers are in high demand. They get hired and go off the market quickly.
The Solution: Clearmont’s Approach
After understanding the requirements and challenges of this project, we presented four candidates to the client. These candidates were thoroughly pre-screened, making this initial roster more on-target than other agencies' first rounds of candidates.
From these four candidates, two were interviewed by the company and one was hired. This architect had the right technical skills and the personality traits needed to undertake this challenge, and proved it by successfully solving an interview test.
The Results: After Our Candidate Started
✔ More than 20 data pipelines were created and automated.
✔ The data is now stored in the cloud in a clean, consistent, ready-to-use state.
✔ The client was able to take the platform from a minimum viable product to a scaled solution with over 1,000 users and a high adoption rate.
✔ The architecture is unified, cloud-based, and scalable, allowing the company to handle larger amounts of data.
“My experience [with Clearmont] has been good in terms of the resumes provided. It’s not just a general pool of candidates that I have to sit and sift through to get to the right set of candidates. There’s a great depth of applied understanding that makes even the first batch of resumes very close to the final one.” – Director of Data Engineering.