Mage Blog

Cole Freeman

July 23, 2024

This article explores how Mage addresses the challenges of financial institutions through features like modular architecture, configurable rules engines, and scalable processing. It also outlines best practices for maintaining regulatory flexibility, including regular reviews, cross-functional collaboration, and continuous monitoring.

Cole Freeman

July 16, 2024

Conditional blocks in Mage are powerful tools for creating dynamic, decision-making data pipelines. This article explains how to implement conditional blocks, using a banking example of processing Suspicious Activity Reports (SARs) for transactions over $10,000.
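
As a taste of the pattern, here is a minimal sketch of a conditional block gating a SAR-filing step. It assumes Mage's @condition decorator template; the upstream transactions DataFrame and its column name are illustrative, not taken from the article.

```python
# Sketch of a conditional block guarding a downstream SAR-processing block.
# Assumes Mage's @condition decorator template; `transactions` and the
# `amount` column are illustrative placeholders.
import pandas as pd

if 'condition' not in globals():
    from mage_ai.data_preparation.decorators import condition  # assumed template import


@condition
def evaluate_condition(transactions: pd.DataFrame, *args, **kwargs) -> bool:
    # Let the SAR-processing block run only when some transaction crosses the
    # $10,000 reporting threshold; otherwise the downstream block is skipped.
    return bool((transactions['amount'] > 10_000).any())
```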

Cole Freeman

July 9, 2024

This article explains the importance of data integration in modern business and introduces five leading platforms: Informatica PowerCenter, Fivetran, Prefect, Talend, and Airbyte. It also outlines key features of each, highlighting various approaches to data integration, from enterprise solutions to open-source alternatives. These platforms offer diverse capabilities like visual workflow design, automated schema management, and extensive pre-built connectors. By exploring these tools, readers gain insights into current data integration technologies and how they address complex data management challenges in today’s data-driven business landscape.

Cole Freeman

June 25, 2024

Organizations face challenges managing vast amounts of fragmented data. Centralized data systems using integration pipelines and incremental models offer a practical solution. These systems unify data, improve quality, and enhance efficiency. Incremental models process only new or updated data, reducing computation time and costs. This approach enables faster decision-making, better resource optimization, and improved analytics capabilities. While implementation can be complex, the long-term benefits make it a valuable strategy for organizations dealing with large-scale, frequently updated data.
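
A small, generic sketch of the incremental idea (not tied to Mage's API): keep a watermark of the last processed update time and only handle rows newer than it on each run. The table and column names are illustrative.

```python
# Generic incremental-processing sketch: filter by a watermark so each run
# touches only new or updated rows instead of re-reading the full source.
import pandas as pd

source = pd.DataFrame({
    'id': [1, 2, 3],
    'amount': [120.0, 45.5, 980.0],
    'updated_at': pd.to_datetime(['2024-06-01', '2024-06-10', '2024-06-20']),
})

last_watermark = pd.Timestamp('2024-06-05')  # timestamp of the previous successful run

# Only rows created or modified after the previous run are processed.
new_rows = source[source['updated_at'] > last_watermark]
next_watermark = source['updated_at'].max()

print(new_rows)
print('advance watermark to', next_watermark)
```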

Cole Freeman

June 19, 2024

Global hooks in Mage are a powerful feature that allows you to execute custom code before or after API operations. They provide flexibility to extend functionality, integrate with external systems, validate data, and more across different components of your application. With targeting conditions and asynchronous execution, global hooks offer granular control and performance optimization.
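
To illustrate the underlying before/after pattern, here is a purely conceptual sketch of pre- and post-operation hooks. It is not Mage's global hook configuration (which is managed through the UI); the hook names and the API operation are illustrative.

```python
# Conceptual before/after hook pattern: run validation hooks before an API
# operation and notification hooks after it. Not Mage's actual hook API.
from typing import Callable, Dict, List

hooks: Dict[str, List[Callable[[dict], None]]] = {'pre': [], 'post': []}


def register(stage: str, fn: Callable[[dict], None]) -> None:
    hooks[stage].append(fn)


def create_pipeline(payload: dict) -> dict:
    for fn in hooks['pre']:
        fn(payload)            # e.g. validate the incoming payload
    result = {'status': 'created', **payload}
    for fn in hooks['post']:
        fn(result)             # e.g. notify an external system
    return result


register('pre', lambda p: print('validating payload', p))
register('post', lambda r: print('notifying external system', r))
create_pipeline({'name': 'example_pipeline'})
```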

Cole Freeman

June 11, 2024

In this tutorial, we integrate dbt with Mage to create a data pipeline, moving data from a source to a PostgreSQL database and performing SQL transformations through staged models. By setting up Docker and PostgreSQL, and following a step-by-step process, we effectively manage data orchestration and analytics using Mage and dbt.
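
As a small sketch of the loader side of such a pipeline, the block below assumes Mage's @data_loader decorator template; the CSV URL is a placeholder for whatever source feeds the downstream dbt staging models.

```python
# Sketch of a Mage data loader block that pulls the raw data a dbt staging
# model would later transform. Assumes Mage's @data_loader template; the URL
# and the hypothetical stg_orders model are placeholders.
import pandas as pd

if 'data_loader' not in globals():
    from mage_ai.data_preparation.decorators import data_loader  # assumed template import


@data_loader
def load_source_data(*args, **kwargs) -> pd.DataFrame:
    # Raw records land here; a dbt staging model (e.g. stg_orders) cleans and
    # renames them in a later step of the same Mage pipeline.
    return pd.read_csv('https://example.com/raw_orders.csv')  # placeholder source
```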

Cole Freeman

June 4, 2024

Backfilling integrates historical data into data pipelines, ensuring completeness and mitigating failures. Mage provides a no-code UI and custom coding options to streamline backfilling for robust, resilient data pipelines.
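
For the custom-coding route, a backfill often boils down to replaying the pipeline over missed date partitions. The sketch below is illustrative; process_partition is a hypothetical stand-in for whatever block logic runs for a given date, not part of Mage's API.

```python
# Illustrative backfill loop: walk each missing daily partition and reprocess
# it once. `process_partition` is a hypothetical placeholder.
from datetime import date, timedelta


def process_partition(partition_date: date) -> None:
    # Placeholder for the real work, e.g. loading and transforming that day's data.
    print(f'processing partition {partition_date.isoformat()}')


def backfill(start: date, end: date) -> None:
    current = start
    while current <= end:
        process_partition(current)
        current += timedelta(days=1)


backfill(date(2024, 5, 1), date(2024, 5, 7))
```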

Matt Palmer

February 9, 2024

Mage now supports a suite of DuckDB & MotherDuck features, from reading and writing DuckDB databases to executing dbt with dbt-duckdb!
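
A small sketch of the kind of DuckDB usage this covers, using DuckDB's Python API directly: writing a DataFrame into a local DuckDB file and reading it back. The file path and table name are placeholders, and this is plain DuckDB rather than Mage's integration code.

```python
# Write a pandas DataFrame into a local DuckDB database file and query it back.
import duckdb
import pandas as pd

df = pd.DataFrame({'id': [1, 2, 3], 'amount': [120.0, 45.5, 9800.0]})

con = duckdb.connect('analytics.duckdb')  # local DuckDB database file (placeholder name)
# DuckDB can scan the in-scope DataFrame `df` directly in SQL.
con.execute('CREATE OR REPLACE TABLE transactions AS SELECT * FROM df')
row_count = con.execute('SELECT count(*) FROM transactions').fetchone()
print(row_count)
con.close()
```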

Thomas Chung

December 12, 2023

Meet the magical members and real-life users of the Mage Community! Read about their experiences, insights, and successes with Mage, from project milestones and productivity boosts to game-changing features.

Shashank Mishra

September 11, 2023

Edit: September 27, 2023

When dbt (Data Build Tool), a renowned data transformation framework, is combined with Mage, it takes ETL processes to new heights. This article explores this powerful synergy, which facilitates enhanced data modeling and gives businesses a competitive edge. By tapping into this combination, data professionals can supercharge their operations and bring data precision to the forefront.