Staff Spotlight: Elizabeth Voeller


Meet Elizabeth, Cybersecurity Director

Our people are at the core of our business. We are featuring the amazing individuals who make eGlobalTech an exciting and supportive place to work.

Elizabeth is one of eGT’s leaders, serving as a Director in our Cybersecurity Practice. In addition to many corporate and professional activities, she provides program management support to Federal enterprise-level cybersecurity programs. With an eye on expanding our cybersecurity service offerings, her strategic vision has strengthened our ability to excel. She also helped establish the “751 Book Club”, part of the eGT Women’s Initiative, which pays tribute to our late founder, Sonya Jain, who loved to read (751 was Sonya’s office number).

Within the professional community, Elizabeth is a regular speaker for the Potomac Forum’s “Cyber Security in Government” Workshop, which educates and motivates Federal government leaders to make positive changes to keep their organizations secure. In 2011, she was also elected by her peers as the Vice President of Programming for MADRA, the Mid-Atlantic Disaster Recovery Association. In 2015, that role expanded, and she now serves as MADRA’s Director of Operations.

When Elizabeth isn’t in the office, she likes to travel internationally, read, spend time at the beach with friends, dance Bachata, explore new vineyards on her husband’s Harley, and train for triathlons. She is the favorite aunt to 7 nieces and nephews back in her home state of North Dakota, with another on the way! She shares life’s adventures with her husband, Rob, and their two pups, Lilah and Piper.

What Is Human-Centered Design?



Human-Centered Design (HCD) is an iterative approach to solution development that keeps the focus on users. Leveraging deep knowledge of user needs and requirements, its techniques enable prototyping of user-focused solutions with continuous feedback. Above all, HCD aims to deliver focused, meaningful solutions.

As solutions rely more and more on advances in artificial intelligence, automation, and robotics, it is crucial to capture and maintain focus on the human aspect of innovation. Without a clearly defined challenge and focused requirements, these kinds of technology solutions can be wasteful, costly, and frustrating to end users. HCD strives to create innovative products, services, and solutions through creative, collaborative practices.

How does the HCD approach work?

The HCD process has three phases that engage the customer throughout the entire solution development lifecycle: the Inspiration Phase, the Ideation Phase, and the Implementation Phase.

First, the Inspiration Phase focuses on learning directly from the client through immersion in their environment. This phase is focused on in-depth research and the identification of requirements. The Inspiration Phase is about adaptive learning, being open to creative possibilities, and trusting that by adhering to end-user needs, the ideas generated will evolve into the right solution.

Next, the Ideation Phase contains two parts: Synthesis and Prototyping. Synthesis brings together the needs and requirements learned during the Inspiration Phase and organizes them into themes, insights, and potential solution opportunities. Prototyping then expands the outputs of Synthesis into testable processes, products, or services. Together, Synthesis and Prototyping form a cyclical process of testing prototypes, gathering feedback, and iterating that is key to creating an effective, innovative solution. This approach tests the desirability, feasibility, and viability of solutions with end users on a small scale and with minimal risk.

Finally, during the Implementation Phase, the complete functional solution is developed and deployed. During this phase, special attention is paid to how the chosen solution will impact the client environment and how it will be rolled out. Even after implementation, HCD encourages ongoing feedback and continuous refinement of the concept.


Using a human-centered approach to design and develop solutions yields substantial benefits for organizations and end users. While user-centered design focuses on improving the interface between users and technology, HCD concentrates on actively involving end users throughout the development process. As a result, solutions designed with human-centered methods increase usability and productivity while improving quality and user experience. HCD succeeds by giving the customer ownership of, and control over, the solution.

EGlobalTech and HCD

At EGlobalTech, we leverage the creative and iterative aspects of human-centered design to develop and implement optimized solutions based on direct user engagement.

Would you like to find out how EGT can employ HCD methods for your organization? Contact us today.

Want to learn more about HCD? Our Human-Centered Design Delivers Focused and Meaningful Solutions White Paper provides more details on the approach.

Staff Spotlight: Michelle Durante


Meet Michelle, Senior Recruiting Manager

Our people are at the core of our business. We are featuring the amazing individuals who make eGlobalTech an exciting and supportive place to work.

As the Senior Recruiting Manager at eGlobalTech, Michelle Durante leads the recruiting team and finds top talent to join our team. Drawn to eGT by the opportunity to build on the company culture and make an impact, Michelle knows the work her team does truly matters. Whether Michelle is finding the right people to support an agency mission or to prevent a cyber-attack, her team’s work has a direct, positive impact on the American public.

Michelle broke into recruiting after hearing the CEO of a woman-owned staffing company speak while she was studying business at Towson University. She reached out to the speaker, who offered Michelle a job and launched her career in recruiting. Building on this experience, Michelle’s biggest advice to the next generation looking to get into recruiting and consulting is: “Don’t be afraid to get up from behind the computer to connect.” That personal connection can make the difference, which is why she and her team host several networking events a year to build out eGT’s talent community.

When outside of work, you can find Michelle working on her garden and collection of indoor plants with her fiancé Josh, playing with her dog Colby, and enjoying time with friends and family. She also serves as the Women in Technology (WIT) Board Member at Large for Sponsorship and Strategic Partnerships, supporting the organization’s mission of advancing women in technology from the classroom to the boardroom.


How To Leverage Data Lakes For Your Organization


Using Data Lakes To Generate New Insights From Data and Build Data Capabilities


As data lakes have increased in popularity over the past five years, cloud providers have started offering data lake capabilities that make it easier to ingest, store, and centralize data across an organization. Most organizations benefit from data lakes because their data is often spread across dozens to hundreds of disparate environments and represented in various formats, both structured and unstructured. Data lakes enable organizations to store their data in a centralized, virtual data platform. Centralization provides many benefits, including predictive analytics, Artificial Intelligence (AI) capabilities, storage across multiple data types (including streaming data and images/videos), data discovery and cognitive search, and granular data security controls.

What Is A Data Lake?

A data lake is a virtual, centralized repository that stores data from across an organization, regardless of data format, structure, or type. A data lake sends and receives data from any database, data warehouse, or API. Its virtual environment permits organizations to move towards data centralization for analytical purposes without decommissioning existing databases. This enables organizations with existing data systems to leverage the benefits of a data lake without deactivating existing or legacy systems. Once the data lake ingests and centralizes the data, data scientists and AI practitioners can build and deploy powerful AI analytics and solutions. These solutions will generate new insights, patterns, and relationships among all the integrated data.

Data Lakes Power Artificial Intelligence

As disciplines within AI continue to advance, the possibilities for both federal agencies and commercial organizations are endless. Perhaps most exciting are the continuously evolving fields of machine learning (ML) and deep learning, which empower organizations to generate new insights from unstructured data. The problem agencies face when they feel ready for AI, ML, and deep learning is that their data isn’t in a single environment where AI algorithms can easily access it. This is why data lakes power AI capabilities. Without data centralization, newly uncovered relationships, patterns, and correlations can’t be deduced, as the AI algorithms will have only a subset of the data to work with. Building an AI model with a reasonable accuracy rate requires high volumes of data, which a data lake can store.

What Is A Data Lake Not?

Data lakes are not data warehouses. A data warehouse requires data to be pre-categorized and tagged before storage. Data lakes are flexible and can ingest and store data as-is. That said, it’s important to note that Extract, Transform, and Load (ETL) operations will still need to occur to prepare data sets for data analytics, ML, or AI models. AI and ML require data in specific formats that predictive algorithms can understand, so it’s important to consider ETL when building analytical models. This is the case regardless of whether you’re using data stored in a data lake, a data warehouse, or a SQL environment.
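To make the ETL point concrete, here is a minimal sketch (all record and field names are hypothetical, not tied to any particular data lake product) of extracting raw lake records, transforming them into a flat, uniform schema, and loading the result into a row list that stands in for a warehouse or feature-store write:

```python
# Minimal ETL sketch: extract raw records, transform them into a
# flat uniform schema, and load them into a list of rows (a stand-in
# for a real warehouse write). Field names are illustrative only.

def transform(record):
    """Flatten one raw record into the columns an analytics model expects."""
    return {
        "id": record.get("id"),
        "amount": float(record.get("amount", 0)),
        # Nested fields are common in lake data; flatten them here.
        "city": record.get("address", {}).get("city", "unknown"),
    }

def run_etl(raw_records):
    """Extract -> Transform -> Load, skipping records missing an id."""
    return [transform(r) for r in raw_records if r.get("id") is not None]

raw = [
    {"id": 1, "amount": "19.99", "address": {"city": "Vienna"}},
    {"amount": "5.00"},   # no id: dropped during transform
    {"id": 2},            # missing fields: defaulted
]
rows = run_etl(raw)
```

A real pipeline would read from the lake’s storage layer and write to an analytical store, but the extract-transform-load shape stays the same.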

Examples Of Capabilities That Can Be Built Using Data Lakes

Create A Google-like Search Engine Within The Data Lake

Following data centralization, agencies can index their data and create a search engine that returns relevant search results quickly while providing a user-friendly search experience.
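The indexing step behind such a search experience can be sketched with a toy inverted index; this is an illustration of the idea, not a production engine like Elasticsearch or Solr, and the document contents are made up:

```python
from collections import defaultdict

# Toy inverted index over centralized documents: map each term to
# the set of document ids containing it, then answer multi-term
# queries by intersecting those sets.

def build_index(docs):
    """Build term -> {doc ids} from a mapping of id -> text."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every query term."""
    terms = query.lower().split()
    if not terms:
        return set()
    results = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        results &= index.get(term, set())
    return results

docs = {
    "d1": "quarterly cyber risk report",
    "d2": "cloud migration risk assessment",
    "d3": "quarterly budget summary",
}
index = build_index(docs)
```

Real search engines add ranking, stemming, and relevance scoring on top of this same basic structure.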

Protect Data At The Column Or Cell Level

Data lakes enforce roles and access policies for every unique data set, including protecting data within an individual table column. This tactic enables granular data protections across various types of data. Additionally, it ensures the same security policies are carried over from the original data source to the data lake.
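As a simplified sketch of column-level protection (roles and column names here are hypothetical), each role can be mapped to the set of columns it may see, and rows are redacted before leaving the lake:

```python
# Sketch of column-level access control: each role is allowed a set
# of columns, and rows are filtered before they are returned to the
# caller. Roles and column names are illustrative only.

COLUMN_POLICY = {
    "analyst": {"id", "region", "total"},
    "admin": {"id", "region", "total", "ssn"},
}

def redact(rows, role):
    """Drop any column the role is not cleared to see."""
    allowed = COLUMN_POLICY.get(role, set())
    return [{k: v for k, v in row.items() if k in allowed} for row in rows]

rows = [{"id": 1, "region": "east", "total": 42, "ssn": "123-45-6789"}]
```

A production data lake enforces the same idea declaratively, with the policy carried over from the original data source.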

Implement Data Governance And Provenance

Knowing how data is accessed and disseminated across an organization is critical to enforce proper governance of the data. With capabilities like Cloudera Data Flow that integrate smoothly into Hadoop-based data lakes, an agency can visually track key data sets and see who’s accessing data, how, and what they’re doing with the data. This provides complete operational oversight into the data lifecycle and offers provenance once new data sets are ingested.
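The core of such provenance tracking is an append-only record of who touched which data set and how. The sketch below shows the idea in miniature (user and data set names are made up); tools like Cloudera Data Flow do this visually and at enterprise scale:

```python
import datetime

# Minimal provenance log sketch: record who accessed which data set
# and what operation they performed, then query a data set's history.

class ProvenanceLog:
    def __init__(self):
        self.events = []

    def record(self, user, dataset, operation):
        """Append one access event with a UTC timestamp."""
        self.events.append({
            "user": user,
            "dataset": dataset,
            "operation": operation,
            "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })

    def history(self, dataset):
        """Full access history for one data set, oldest first."""
        return [e for e in self.events if e["dataset"] == dataset]

log = ProvenanceLog()
log.record("evoeller", "claims_2019", "read")
log.record("jjackson", "claims_2019", "export")
log.record("jjackson", "budget", "read")
```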

Want To Learn More?

Data lakes can empower your organization to execute new types of analytics and make faster, more informed decisions. Our AI experts at eGlobalTech implement data lakes for both on-premises systems and cloud providers, enabling our clients to securely store data while harnessing more data in less time. Our Senior Director of Technology Strategy and Head of eGT Labs, Jesus Jackson, will present on data lake implementation and use cases at the O’Reilly Software Architecture Conference.

Have questions? Contact us at egtlabs@eglobaltech to find out how eGlobalTech can deploy data lakes to support your organization.

eGlobalTech Recognized as Best of Vienna


2019 Best of Vienna Award

eGlobalTech (eGT) is pleased to announce that we’ve been named 2019 Best of Vienna for Management Consulting. eGT, a Tetra Tech Company, is a leading provider of innovative IT, cyber, and management consulting services for the federal government.

The Vienna Award Program is an annual awards program honoring the achievements and accomplishments of local businesses throughout the Vienna area. Recognition is given to companies that have used best practices and implemented programs that generate competitive advantages and long-term value.

“There are several management consulting firms in our area, and we are honored to be chosen as ‘Best of Vienna’ in this category.  This is truly a testament to our employees and their expertise in supporting our clients’ mission objectives,” said Branko Primetica, eGT’s President and Chief Strategy Officer.

For More Information, Contact:
eGT Public Relations

Announcing Product Update: Cloudamatic 3.0


We are pleased to announce a major product update to eGlobalTech’s Cloudamatic®, a scalable open source solution for automating the complete deployment and orchestration of infrastructure, security, configuration, and provisioning for any application to the cloud.  Used in both the public and private sectors, Cloudamatic shortens application migration cycles from weeks to a single day.

With release 3.0, Cloudamatic includes the following features:

  • Complete Microsoft Azure Migration & Deployment Support
  • Containerization Support Across All Cloud Providers
  • Kubernetes Support Across All Cloud Providers

Find out more about these features here.

eGlobalTech Has Moved!


We have exciting news – we’ve moved! To accommodate business growth, eGlobalTech relocated our corporate headquarters from Arlington to the Tysons Corner area on September 16, 2019. Our new headquarters is located at 1900 Gallows Road, Suite 800, Vienna, Virginia.

The new Tysons headquarters will provide a contemporary workplace, including a dedicated space for eGT Labs (our R&D arm). Consolidating four office locations to one headquarters, our new space will empower employees to easily work together and solve problems more efficiently, enhancing customer experience. The new facility will support eGT’s anticipated future growth.

Read the full press release for more details.


eGlobalTech Launches Artificial Intelligence Solution Auxilium


Arlington, VA, July 30, 2019— eGlobalTech, A Tetra Tech Company, is pleased to announce the launch of Auxilium, eGlobalTech’s premier Artificial Intelligence (AI) solution. Auxilium is an open source chatbot solution which answers internal and external stakeholder questions efficiently and effectively, empowering teams to focus on higher-level tasks and complex business problems.

An innovative and impactful tool for both federal and commercial organizations, Auxilium:

  • Understands intent
  • Answers questions in milliseconds
  • Monitors impact through a customizable dashboard of detailed analytics
  • Supports your brand through tailored answers and the interface’s aesthetic
  • Integrates seamlessly with existing tools, including Salesforce and Slack
  • Replies to voice commands, increasing accessibility
  • Assists stakeholders securely with data protection and access control
  • Avoids significant changes to your system, negating the need for an Authority to Operate (ATO)

Jesus Jackson, eGlobalTech’s Head of eGT Labs said, “Our chatbot is designed with the latest in machine learning and AI technology to troubleshoot questions while reducing your team’s workload. With Auxilium’s support, your employees can focus on the bigger picture.” eGlobalTech’s AI chatbot has endless possibilities for organizations, from relieving a help desk to assisting with internal training to supporting web portal service functions.

Read the full press release here.

eGlobalTech Awarded GSA OGP Program and Operations Management Contract


Arlington, VA, May 2, 2019— eGlobalTech, a Tetra Tech company and a leading provider of innovative Information Technology (IT), cybersecurity, and management consulting services for the federal government, won the General Services Administration (GSA) Office of Government-wide Policy (OGP) recompete contract to provide project and operations management support services.

The purpose of this contract is to provide comprehensive project and operations management support across all OGP offices, programs, and services. Support requirements include, but are not limited to: end-to-end project management using agile techniques; insightful program briefings; performance metrics collection and analysis; risk management; and executive meeting support.

Joseph Zimmerman, eGlobalTech’s Chief Operating Officer said, “We are very excited to continue to provide these critical services to GSA. We greatly value the trusted relationship and appreciate the partnership we have developed with GSA. We have delivered meaningful results and will continue to increase program performance and staff our team with highly talented and skilled personnel that will exceed GSA’s expectations.”

Read the full press release here.

Techniques for Designing Microservices


Part 2 of our Microservices & Containers Series

In our “Part 1: Microservices & Containers – A Happy Union” blog post, we outlined the benefits of microservices and described how to integrate them with containers so teams can build and deploy microservices in any environment. In this second part of the series, we explain approaches for defining and designing flexible microservices. When designing microservices, it’s important to decompose each service into a business capability. Because microservices follow a lean, focused philosophy, designing them around business capabilities ensures no unnecessary features or functionality are designed or built, which reduces project risk, the need to refactor unnecessary code, and the complexity of the overall product. Since microservices are built around business capabilities, it’s critical to have business stakeholders or users participate in the design sessions.

Defining Microservices

It’s tempting to start implementing small services right away and assume that, when combined, all services will represent a cohesive, modular product. Before diving into implementation, it’s critical to understand the complete picture of all services and how they interact with one another, in order to avoid feature creep and features that don’t meet business needs. An effective approach is to have key technical staff (usually a lead designer, technical lead, and architect) and stakeholders collaborate using event storming. Event storming enables project implementers and domain experts to describe an entire product or system in terms of the events that happen. This gives both business and technical staff command of the problem space and lets them design product services using plain, easy-to-understand descriptions rather than technical jargon.

Using post-it notes, the team arranges events in a rough order of how they might happen, without at first considering how they happen or what technologies or supporting structures might be involved. Events should be self-contained and self-describing, with no concern for implementation details. During this exercise, it’s helpful to draw a causality graph to explore when events occur and in what order. Once all events are documented, the team explores what could go wrong in each context. This prompts the question “What events do we need to know about?” and helps identify missing events, a powerful technique for surfacing boundary conditions and assumptions that might affect realistic estimates of how complex the software will be to build. When the team feels all events have been adequately documented, the next step is to document user personas, commands, and aggregates.

  • User Personas
    • User personas document the various users that would use the system. Personas help teams understand the goals of the user performing a given action, which is helpful in the design phase.
  • Commands
A command is a user action or external-system trigger that causes an event.
  • Aggregate
An aggregate receives commands and decides whether or not to execute them, producing other events as necessary.

Once all personas, commands, and aggregates are documented, the team can see the big picture of how the entire system or product should work to meet all requirements. This approach is excellent for designing microservices, as each event, or a small handful of events, can be cleanly assigned to a microservice. The service author then creates a service that accommodates only those events, yielding lean business capabilities with a well-defined scope and purpose. Event storming also works well for both technical and non-technical stakeholders, because the entire system is described by its events; since technical implementation details are not discussed, barriers to participating in the design process are removed. The approach works equally well for an existing system or a new application.
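To show how event-storming outputs map onto code, here is a minimal sketch (the order domain and all names are hypothetical): a command is handed to an aggregate, which decides whether to accept it and, if so, emits an event.

```python
from dataclasses import dataclass

# Sketch of event-storming concepts in code: commands flow into an
# aggregate, which enforces its rules and emits events when a
# command is accepted. Domain names are illustrative only.

@dataclass
class Command:
    name: str
    payload: dict

@dataclass
class Event:
    name: str
    payload: dict

class OrderAggregate:
    """Receives commands and decides whether to produce events."""

    def __init__(self):
        self.placed = False

    def handle(self, command):
        # Accept PlaceOrder only once; duplicates produce no event.
        if command.name == "PlaceOrder" and not self.placed:
            self.placed = True
            return Event("OrderPlaced", command.payload)
        return None  # command rejected: no event produced

order = OrderAggregate()
first = order.handle(Command("PlaceOrder", {"order_id": 7}))
second = order.handle(Command("PlaceOrder", {"order_id": 7}))
```

Each microservice would own one such aggregate (or a small cluster of related ones), keeping its scope limited to the events identified in the workshop.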

Design Techniques for Microservices

Once a team has defined and organized all of its services, it can focus on the technical details of each microservice. Implementation details will be specific to a given service; the guidelines below will help when building out a microservice:

  • Develop a RESTful API
    • Each microservice needs a mechanism for sending and consuming data and for integrating with other services. To ensure smooth integration, we recommend exposing a RESTful API with appropriate functionality and well-defined response data and formats.
  • Manage Traffic Effectively
    • A microservice that must absorb thousands or millions of direct requests from other services can become overwhelmed and ineffective at meeting those services’ needs. We recommend using a messaging and communication service such as RabbitMQ or Redis to buffer and manage traffic load.
  • Maintain Individual State
    • If a service must maintain state, that service can define the database requirements that satisfy its needs. Databases should not be shared across microservices, as sharing violates the principle of decoupling, and table changes in one microservice could negatively impact another service.
  • Leverage Containers for Deployments
    • As covered in Part 1, we recommend deploying microservices in containers so only a single tool is required (containerization tools like Docker or OpenShift) to deploy an entire system or product.
  • Integrate into the DevSecOps Pipeline
    • It’s important that each microservice maintains its own separate build and is integrated into the overall DevSecOps CI/CD pipeline. This makes it easy to run automated tests on each individual service and to isolate and fix bugs as needed.
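The traffic-management guideline above can be sketched in-process with a queue standing in for a message broker (in a real deployment this would be RabbitMQ, Redis, or similar; the invoice workload here is made up): producers enqueue requests and return immediately, and the service drains the queue at its own pace.

```python
import queue

# In-process sketch of broker-mediated traffic: callers enqueue
# requests instead of invoking the service directly, decoupling
# producer rate from consumer rate. A real system would use a
# network broker such as RabbitMQ or Redis.

broker = queue.Queue()

def submit(request):
    """Producer side: hand the request to the broker and return."""
    broker.put(request)

def drain(handler):
    """Consumer side: process everything currently queued."""
    handled = []
    while not broker.empty():
        handled.append(handler(broker.get()))
    return handled

# Producers burst three requests; the consumer catches up later.
for i in range(3):
    submit({"invoice_id": i})

results = drain(lambda req: f"processed invoice {req['invoice_id']}")
```

Because the broker absorbs bursts, the consuming service never sees more load than it pulls for itself, which is the property that keeps a heavily called microservice responsive.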

How eGlobalTech Can Help You Deploy Microservices

As outlined in Part 1 of our blog series, eGlobalTech has extensive past performance developing and deploying microservices for multiple clients. Our experience includes containerization through Docker and OpenShift, and we have leveraged containers to deploy microservices across many complex applications. Our Technology Solutions group built and integrated microservices on existing legacy applications, developed new applications using microservices, and migrated legacy architectures to complete microservices-driven architectures. If you’d like to discuss how eGlobalTech can help your organization embrace or implement microservices, please email our experts at!