Software Data Operations Engineer
Company: MAQ Software
Location: Plano
Posted on: February 16, 2026
Job Description:

Software Data Operations Engineer, Plano (Dallas), Texas

About MAQ Software
MAQ Software
enables leading companies to accelerate their business intelligence
and analytics initiatives. Our AI-powered solutions enable clients
to improve their productivity, reduce costs, increase sales, and
build stronger customer relationships. Our clients consistently
recognize us for providing architecture and governance frameworks
using the Well-Architected Framework, implementing best practices
to optimize reports, and building team capability through training
programs. Clients choose to work with us because they are confident
in our software delivery. Our confidence results from a commitment
to consistent outcomes, reduced time to market, and a transparent
workflow. Clients benefit from daily software updates, agile
practices, domain expertise, AI adoption, and rapid feedback
implementation. As a premier supplier to Microsoft for over two
decades, MAQ Software clients gain extensive insights and
engineering practices across the Microsoft platform and can improve
their implementations with our breadth and depth of expertise. As one of Microsoft's top 25 global partners, MAQ Software has earned eight specializations for meeting Microsoft's highest standards of service delivery to Fortune 500 companies. With over 1,800
engineers, MAQ Software has globally integrated teams in Redmond,
Washington; Plano, Texas; and Noida, Mumbai, and Hyderabad, India,
delivering solutions with increased velocity and tech intensity.
Our daily delivery and feedback model offers the flexibility to
adapt solutions to changing business needs. MAQ Software’s
dedication to customer success has led to sustained growth. Inc.
magazine has recognized MAQ Software for sustained organic growth
by listing us on the Inc. 5000 list twelve times, a rare honor.

Engineering culture
At MAQ Software, we foster a strong engineering
culture with a can-do attitude. Our key managers come from
excellent educational backgrounds and have significant experience
in growing a company and mentoring software engineers. Due to our
smaller size, we are agile and able to adopt the latest
technologies and computing trends ahead of larger industry players.
As a part of our globally distributed engineering team, our
engineers gain exposure to the latest software engineering
practices and fast development cycles, providing them with the
opportunity to work on challenging technical problems that utilize
cutting-edge technologies for fast-paced software delivery. Our
collaborative and supportive work environment encourages innovation
and growth, making our company an exciting and rewarding place to
work.

Examples of our projects:

We built an agentic AI solution
that redefined event information access for a global technology
leader, delivering instant, accurate answers to attendee questions
at scale. Using Azure OpenAI Service, Azure AI Search, and Azure
Machine Learning, combined with Retrieval-Augmented Generation
(RAG), we engineered a secure, multilingual Copilot platform
capable of processing data from 22 file formats across multiple
sources. Our team leveraged Azure Databricks for advanced data
transformation, Azure Front Door for global load balancing, and
Microsoft Entra ID with Azure Key Vault for enterprise-grade
security. The result: a system that served over 200,000 users,
handled millions of interactions, and reduced support requests by
70%, setting a new benchmark for scalable, intelligent, and
personalized event experiences.

We built a high-performance
reporting solution that redefined data refresh efficiency for a
leading office supply retailer, delivering near real-time insights
and dramatically faster report access. By integrating
Snowflake-managed Iceberg tables into Microsoft Fabric via OneLake
shortcuts and leveraging Power BI’s Direct Lake mode, we built a
scalable architecture that eliminated data duplication and
significantly reduced latency. Our team leveraged Azure Data Lake
Storage for external volume dumps, semantic modeling for
streamlined analytics, and Fabric F256 SKU with autoscale for
optimized compute. The result: report load times dropped by 80%,
refresh cycles shrank from 10 days to minutes, and infrastructure
costs fell by 60%, setting a new benchmark for enterprise-grade
reporting performance.

We helped a global industrial technology
leader transform its construction operations by replacing a
fragmented, batch-based ERP system with a real-time intelligence
solution built on Microsoft Fabric. The previous setup—based on
Databricks and Snowflake—struggled with delayed data updates, high
maintenance, and siloed systems. By implementing Microsoft Fabric’s
Real-Time Intelligence capabilities, including Eventstreams,
Reflex, and Power BI dashboards, we enabled real-time data
ingestion, anomaly detection, and instant insights. The result:
20–30% cost savings, 50% less IT maintenance, and a 10x improvement
in data latency. This empowered faster decision-making, proactive
issue resolution, and a more agile, data-driven organization.

We
helped a Fortune 500 office supply retailer modernize its reporting
by migrating from MicroStrategy to Power BI. The legacy system was
costly and inefficient, requiring over 150 duplicate reports and
individual licenses for each user. Our solution consolidated these
into just five dynamic Power BI reports, each customizable with
bookmarks and filters. By leveraging Snowflake and Import Mode, we
delivered a scalable semantic model and a user-friendly experience.
The result: reduced license costs, faster reporting, and empowered
business users with self-service analytics.

We built an AI-powered
chatbot for a global manufacturer of industrial test and diagnostic
equipment to improve access to project lifecycle management data.
Using Azure OpenAI, Azure AI Search, and Azure App Service, the
solution indexed documents from SharePoint—including PDFs, Excel
files, and dynamic web pages—into a centralized, searchable
interface. Power Automate and custom code streamlined data
ingestion and processing, while Azure Cosmos DB stored historical
interactions. The chatbot enabled employees to retrieve accurate
insights quickly, reducing manual effort and improving
decision-making. Post-deployment, the client achieved higher
productivity, lower operational costs, and seamless access to
project data through a single unified interface.

To read about some of our recent projects, visit https://maqsoftware.com/case-studies.

Responsibilities

Analyze existing systems (30%)
Collect requirement
specifications to analyze business processes and determine the
exact nature of users’ system requirements. Use tools like
Microsoft 365 Copilot to summarize stakeholder input and map
process flow from documents, emails, and meetings. Collaborate with
module leaders and core team members to decide on system
architecture. Analyze existing system structures to identify
opportunities for migration to cloud-based platforms. Use GitHub
Copilot to assist in refactoring legacy code and Microsoft Azure AI
services to evaluate cloud readiness and performance benchmarks.
Analyze user requirements and align them with available enterprise
data sources to design solutions that deliver reliable performance
and reasonable cost. Use tools such as GitHub Copilot to assist
with query optimization and Azure DevOps MCP Server to validate
data lineage and dependencies within project repositories. Design
processing steps and recommend system solutions based on user
requirements, ensuring clarity and scalability.

Develop specifications and workflow (25%)
Prepare software specifications,
flow charts, and process diagrams for software programmers to
follow. Use GitHub Copilot to generate code templates and automate
systems documentation such as design specifications, user manuals,
technical manuals, descriptions of application operations, and
methodology documentation. Analyze feasibility using commercially
available software systems (e.g., Microsoft Azure versus Amazon Web
Services) and reporting systems (e.g., Power BI versus Tableau).

Analyze and verify implementation (25%)
Collaborate with systems
analysts and programmers to develop data migration tools and define
operational workflows for new systems. Work with software
developers to use GitHub Copilot for accelerating code development.
Set up the test environment, use GitHub Copilot to generate test
cases, and use Playwright MCP to automate testing and compare data
from multiple sources to verify reports for end users.

Review implementation status and reporting (10%)
Participate in technical
collaboration meetings and periodical reviews of implementation
status. Report weekly task plan to the project management team for
implementation of custom software.

Training and certifications (10%)
Participate in technical training and complete relevant industry courses and certifications.

Qualifications
Undergraduate
or graduate degree in Computer Science, Information Systems,
Applied Computational Math Sciences, or related engineering
discipline.

Benefits
Annual salary range depends on education and experience.
Comprehensive medical, dental, and vision insurance with employee premiums paid in full.
401(k) retirement plan with 3% company match and immediate vesting.
Paid time off (three weeks).
Keywords: MAQ Software, The Colony, Software Data Operations Engineer, IT / Software / Systems, Plano, Texas