
Modernizing your analytics architecture part 2: Different approaches to sizing

by Terje Vatle

12. Oct 2021, 5 minutes reading time


Sizing your architecture typically means deciding on the key components for data storage and data processing. For example, a database is typically well suited for governed data integration processes on structured data, enabling reports and analytics. A data lake, on the other hand, is designed for flexible storage formats, enabling bigger analytics use cases and cheaper data storage, but typically with less governance attached.

In addition, there are massively parallel processing (MPP) databases and other distributed processing systems well suited to take on the biggest workloads and analytical tasks. Some storage platforms, such as Snowflake and the upcoming generation 3 of Azure Synapse Analytics, can provide extensive scalability. However, there is typically a premium price attached, and you risk over-paying relative to your current business needs.

Two typical approaches to sizing your architecture are:

  • Current requirements approach: Choose the right price-performance architecture based on current requirements. 
    • Pros: The initial price will be just about right for the level of service you need today. 
    • Cons: If business needs change, the architecture risks becoming obsolete. You may have to throw away previous work and undertake expensive, risky, and time-consuming rebuilding activities. It may also introduce the cost, compliance, and other downsides of a legacy architecture. 
  • Future maximum approach: Choose an architecture you think will fulfill your requirements 5-10 years ahead.
    • Pros: If your predictions of future business needs are accurate, you will have a data platform with an extended lifetime. 
    • Cons: At the beginning of the lifetime, you risk over-paying compared to what you need. You may even design an architecture your organization isn't ready for, or one your data volumes are too small, or your analytics ambitions too modest, to justify. And there is no guarantee that your prediction of the future is correct: business needs may change, and so may the technology landscape. For example, new technology entrants may emerge that prove a better match for your needs. 

Neither the current-requirements (minimum) nor the future-maximum approach alone allows you to optimize your TCO over time. Each would typically lock your analytics architecture into specific technologies for data storage and data processing. An MPP approach may scale well, but as a starting point it may prove expensive and overkill compared to your current analytics ambitions. 
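To make the trade-off concrete, here is a minimal sketch of how the two approaches play out as cumulative cost over time. All figures (annual run cost, rebuild cost, rebuild year) are purely illustrative assumptions, not vendor pricing: the current-requirements path is cheap early but absorbs a one-off rebuild when needs outgrow it, while the future-maximum path pays a steady premium from day one.

```python
# Hypothetical TCO sketch for the two sizing approaches.
# All cost figures are illustrative assumptions, not real pricing.

def tco_current_requirements(years, annual_cost=50, rebuild_year=5,
                             rebuild_cost=300, annual_cost_after=80):
    """Right-size for today; pay a one-off rebuild when needs outgrow it."""
    total = 0
    for year in range(1, years + 1):
        total += annual_cost if year < rebuild_year else annual_cost_after
        if year == rebuild_year:
            total += rebuild_cost  # migration / re-architecting effort
    return total

def tco_future_maximum(years, annual_cost=120):
    """Over-provision up front: higher steady run cost, but no rebuild."""
    return annual_cost * years

# Compare cumulative cost at a short and a long horizon.
for horizon in (3, 10):
    print(f"{horizon} years: current-requirements = "
          f"{tco_current_requirements(horizon)}, "
          f"future-maximum = {tco_future_maximum(horizon)}")
```

Under these made-up numbers the current-requirements approach stays cheaper at both horizons, but the gap narrows after the rebuild; shift the rebuild earlier or raise its cost and the ranking flips. The point of the sketch is only that neither curve dominates for free, which is what motivates the continuous right-sizing discussed next.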

Read the next blog entry to learn more about a continuous approach to right-sizing your analytics architecture.

Next: Modernizing your analytics architecture part 3: Continuous right-sizing

 

For more information on data warehouse automation, check out our online guide. 


 

Terje Vatle

Terje Vatle is Chief Technology Officer at BI Builders, following global market trends within data & analytics. Terje has a technology and advisory background and focuses on helping organizations achieve their goals by becoming more data-driven.

