Our white papers provide insights into technology and cyber trends and innovations in the public sector.

Human-Centered Design Delivers Focused and Meaningful Solutions


Human-Centered Design (HCD) begins with obtaining a deep understanding of customers’ needs and leveraging creativity and continuous feedback to better realize useful and tailored solutions. HCD provides a framework for connecting the best ideas to actual user needs – thereby producing successful solutions that last. 

BACKGROUND

Too often, service and solution providers make assumptions about the wants and needs of their client base, trying to overlay a technological solution onto an issue without context. As solutions rely more on advances in artificial intelligence and robotics, it will be crucial to capture and maintain focus on the human element. Without direct interaction with the client to empathize with and fully understand the issue in the context of their environment, organizations may develop solutions that do not fully respond to or resolve end-user business needs or requirements. These extraneous solutions can be wasteful, costly, and frustrating to end users.

HCD facilitates an iterative development approach aimed at making systems more usable and useful by focusing on the users and their needs and requirements, and by applying human factors and usability knowledge. HCD strives to create innovative products, services, and solutions through creative and collaborative practices.

In an agile environment, organizations can no longer rely on traditional, assumption-driven approaches. By employing HCD, developers can create solutions that align with a customer's values and build products or services that are more effective and intuitive to use.

HUMAN-CENTERED DESIGN ENGAGES THE CUSTOMER THROUGHOUT THE ENTIRE LIFECYCLE

The HCD process has three phases – the Inspiration Phase, the Ideation Phase, and the Implementation Phase.


Inspiration Phase

During the Inspiration Phase, the focus is on learning directly from the client through immersion in their environment. The Inspiration Phase is about adaptive learning, being open to creative possibilities, and trusting that, by adhering to the needs of the client, the ideas generated will evolve into the right solution.

Ideation Phase

The Ideation Phase turns the insights gathered during Inspiration into opportunities for design and consists of two parts: Synthesis and Prototyping.

Synthesis

Synthesis brings together the needs and requirements learned during the Inspiration Phase and organizes them into themes and insights. The outputs of Synthesis are used to identify the best ideas and develop them into opportunities to prototype and test.

Prototyping

Following the synthesis of ideas into opportunities, the second part of the Ideation Phase is prototyping: expanding ideas into testable processes, products, or services. This cyclical process of testing prototypes, gathering feedback, and iterating is essential to creating an effective, innovative solution. HCD leverages the prototype, or pilot, approach as a tool to test the desirability, feasibility, and viability of solutions with clients at small scale and with minimal risk.

Whereas user-centered design focuses on improving the interface between users and technology, HCD concentrates on actively involving the client throughout the entire improvement and development process.

Implementation Phase

During the Implementation Phase, special attention is paid to how the chosen solution will affect the client environment and how it will be rolled out. Long-term success may require incremental change; therefore, understanding the target audience and planning for change management are paramount.

HCD SUCCEEDS BY GIVING OWNERSHIP AND CONTROL OF THE SOLUTION TO THE CUSTOMER

Even after a solution is implemented, HCD encourages iterative, post-implementation feedback gathering and continuous refinement of the concept to best meet the end user’s needs.

Using a human-centered approach to design and development has substantial benefits for IT organizations and end users. Highly usable systems and products tend to be more successful both technically and with the people who use them. Solutions designed using human-centered methods improve quality by:

  • Increasing the productivity of users and the operational efficiency of organizations
  • Creating systems that are more intuitive, reducing training and support costs
  • Increasing usability for users with a wider range of capabilities, improving accessibility
  • Improving the user experience, reducing discomfort and stress

Contact us at info@eglobaltech.com to find out how you can build successful technology solutions with our HCD framework!

 


Modernizing Legacy Applications With Microservices

BACKGROUND

In both commercial and government information technology (IT), it is common to see large, monolithic applications grow in size and scope at a rapid pace until the application becomes nearly unmanageable, unsustainable, and unable to adapt to changes in the business environment. The more successful an application is initially, the more likely it is to grow in its capabilities until it operates well outside of its original planned scope and is burdened by too many features. Many of these solutions are developed as monolithic applications where most features are tightly coupled into a singular environment. In situations like this, the monolithic environment may become an impediment to progress, slowing down the software engineering team's ability to enhance applications and resolve defects. Microservices present a design approach that keeps solutions small and modular and promotes extensibility. These benefits are further enhanced when paired with serverless computing, which adds tremendous scalability and performance and drastically reduces the cost of ownership with little effort.

WHAT ARE MICROSERVICES AND SERVERLESS COMPUTING? WHY DO THEY MATTER?

Microservices are discrete, self-contained, easily deployed, and easily managed services that operate on their own but are designed to be integrated into larger solutions. The key to microservices is that they serve a specific, well-defined purpose, typically determined by decomposing business capabilities down to a single verb, noun, or use case. A microservice is built to operate in its own environment and communicate with other microservices through highly efficient, loosely-coupled interfaces. Microservices are usable by web, desktop, and mobile applications as well as by other microservices.

Serverless computing enhances microservices by providing an ideal platform that scales up and down instantly with little to no manual intervention. Serverless computing differs from standard cloud web hosting in that the developer does not need to manage the server. The code is deployed to endpoints, and the service fabric then completely manages it. In times of high utilization, the service fabric increases the number of instances to meet performance demands; as the workload decreases, it automatically decreases the number of instances. Serverless computing is an ideal underpinning for highly modular solutions such as microservices. As a bonus, serverless computing charges are incurred only while the services are running. Combined, microservices architecture and serverless computing provide highly secure, extensible, and scalable environments with tremendous cost efficiency.
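
To make the serverless model concrete, here is a minimal sketch of an AWS Lambda handler in Python. The event fields and response shape follow the common API Gateway proxy convention, and the "name" parameter is a hypothetical input for illustration; Lambda itself only requires a callable that accepts an event and a context.

```python
import json

# Minimal AWS Lambda handler: the service fabric invokes this function per
# request and scales the number of instances up and down automatically.
def handler(event, context):
    # Hypothetical input: read an optional "name" query parameter.
    name = (event.get("queryStringParameters") or {}).get("name", "world")

    # Return an API Gateway-style response: a status code plus a JSON body.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

Because no server is provisioned, charges accrue only while invocations such as this one are actually running.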

HOW ARE THEY DIFFERENT FROM DEVELOPING CODE COMPONENTS?

Many developers attempt to build modular systems using code-level components. These components are dropped into existing applications and compiled and deployed with each application that uses them. On the positive side, this yields some reuse and good performance. On the negative side, this approach is not as reusable as microservices, and it leads to configuration management challenges where different projects may require different versions of the component. Once a library is updated, every application that uses it must be updated, recompiled, and redeployed, even for a minor change.

Microservices are self-contained. They are built and compiled as separate projects, deployed in their own containers, and typically backed by their own databases. They are not compiled directly into solutions but are standalone services designed to be connected via standards-based, loosely-coupled protocols such as Representational State Transfer (REST) and JavaScript Object Notation (JSON). In most cases, microservices are carefully versioned, allowing different applications to continue to operate even when a newer version of a service is released, without necessarily requiring recoding, redeployment, or rework.
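
As a sketch of what such a service can look like, the following Flask application exposes a single versioned REST endpoint that returns JSON. The route, data, and port are illustrative assumptions, not a prescribed design; the point is the standalone, versioned, loosely-coupled interface.

```python
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Hypothetical in-memory data; a real microservice would typically own its
# own database rather than share one with other services.
CATALOG = {"101": {"id": "101", "name": "Widget", "price": 9.99}}

# The version lives in the URL, so a /v2/ route can be introduced later
# without breaking clients that still depend on /v1/.
@app.route("/v1/catalog/<item_id>", methods=["GET"])
def get_item(item_id):
    item = CATALOG.get(item_id)
    if item is None:
        abort(404)
    return jsonify(item)  # JSON over REST: the loosely-coupled contract

if __name__ == "__main__":
    app.run(port=8080)
```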

For example, one microservice may be developed to provide a catalog of products, another to manage procurements, and another to manage payments. These microservices might be built independently but together provide the core of an end-to-end solution for end users to browse products, manage acquisitions, arrange payments, and track progress. When combined, these atomic services can serve many other types of solutions and provide a "single source of truth" for each business subject area that can be connected to manage entire business processes. Agile development teams experienced in building microservice architecture-based solutions have a keen eye for decomposing solutions into discrete services and connecting them efficiently and extensibly.
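
A composing application might then stitch such services together over HTTP. The sketch below, written against hypothetical catalog and payments endpoints using the requests library, shows how an end-to-end flow can be assembled without compiling any service into the client.

```python
import requests

# Hypothetical base URLs; each microservice is deployed and versioned on its own.
CATALOG_URL = "https://catalog.example.gov/v1"
PAYMENTS_URL = "https://payments.example.gov/v1"

def purchase(item_id: str, account: str) -> dict:
    # Look up the product in the catalog service, the "single source of
    # truth" for product data.
    item = requests.get(f"{CATALOG_URL}/catalog/{item_id}", timeout=5)
    item.raise_for_status()

    # Hand the transaction off to the payments service.
    payment = requests.post(
        f"{PAYMENTS_URL}/payments",
        json={"account": account, "amount": item.json()["price"]},
        timeout=5,
    )
    payment.raise_for_status()
    return payment.json()
```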

APPLY MICROSERVICES AND SERVERLESS BEST PRACTICES

The best approach to getting started with microservices is to identify the right application to modernize, one that will help establish and capitalize on early successes. The application should sit in the right business domain, and the organization should engage the right contractor to guide it through the pilot project. To ensure success in a first microservices initiative, the following best practices apply:

  • Identify an application that supports a well-understood business domain that is easily decomposed into microservices.
  • Identify an application that supports processes that are well defined. Having to wait for business process re-engineering may slow the effort.
  • Target applications that already have funding for development or modernization if possible.
  • Engage experts in microservices and serverless solution design with demonstrated experience in fielding microservices.
  • Build and deploy the solution using leading-edge serverless platforms such as Amazon Web Services (AWS) Lambda or Azure Functions (see the deployment sketch after this list).
  • Employ compatible development processes, including DevOps and continuous integration/continuous delivery (CI/CD) to achieve maximum value as early as possible.
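
As a minimal sketch of those last two practices, the snippet below deploys a packaged function to AWS Lambda with boto3 as a CI/CD pipeline step. The function name, role ARN, and artifact path are placeholders, not prescribed values.

```python
import boto3

lam = boto3.client("lambda")

# Placeholders: supply your own IAM role and build artifact.
ROLE_ARN = "arn:aws:iam::123456789012:role/example-lambda-role"

with open("build/catalog_service.zip", "rb") as f:
    lam.create_function(
        FunctionName="catalog-service",  # hypothetical service name
        Runtime="python3.12",
        Role=ROLE_ARN,
        Handler="app.handler",           # module.function inside the zip
        Code={"ZipFile": f.read()},
    )
```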

By transforming a project to a microservices architecture on a serverless computing environment, and by building it with experts in DevOps and CI/CD, an organization can see excellent results early and often. The result is faster delivery of actual value, rapid and consistent deployment of working code, lower operations and maintenance costs, and a higher degree of reusability.

THE POTENTIAL PITFALLS OF MICROSERVICES

Technology employed by inexperienced personnel can lead to undesirable results and disappointment, and microservices are no exception. A common mistake is not carefully defining the scope and purpose of a microservice, allowing it to grow into its own monolith. Tightly binding services together and allowing too many dependencies makes microservices brittle, forces frequent refactoring, and reduces extensibility. Conversely, making microservices overly granular creates microservice sprawl. To combat this, federal IT organizations should enlist contractors with proven experience in decomposing applications into microservices. For maximum benefit, the contractor should also possess a proven background in DevOps, CI/CD, and automated cloud deployment.

CASE STUDIES

eGlobalTech (eGT) is supporting the transformation of a mission-critical, Oracle-heavy system used for emergency communications at the Department of Homeland Security (DHS) to a lightweight microservices architecture. We applied the strangler pattern to incrementally refactor existing application code into microservices deployed on AWS Lambda, a serverless deployment model. The combination of microservices and serverless computing is enabling eGT to save the customer from expensive Oracle licenses and optimize cloud expenditure while simplifying operations and maintenance.

At the Department of Health and Human Services (HHS), our customer was challenged with managing and maintaining more than six systems for a diverse set of grants programs. eGT implemented a new grants performance management platform applying microservices principles, replacing the existing systems and consolidating all the grants programs into one common platform. By engineering a flexible design, we can declaratively onboard new grants programs and accommodate changes to existing program requirements with no changes to the underlying software. eGT deployed the first production release to Microsoft Azure in less than four months and has since continuously delivered new features and capabilities through pipeline automation.

CONCLUSION

Microservices offer a pathway to get under-performing modernization initiatives on the right track. Because of their highly modular design, they yield tangible results more quickly. By accelerating the development of high-performing, modern applications, microservices break the cyclical, large-scale modernization pattern most federal agencies experience every ten or so years; they facilitate continuous, perpetual evolution of applications and shield them from becoming obsolete and costly. Microservices deployed on modern serverless computing environments drastically enhance scalability and reliability while providing substantial cost avoidance. Achieving these results, however, requires support from agile development teams with demonstrated experience in transforming monolithic applications into a microservices architecture.

Contact us at info@eglobaltech.com to find out how your organization can begin the digital transformation to microservices!

 


Best Practices in Data Analytics


BACKGROUND

Today the average agency within the Federal Government manages dozens, if not hundreds, of data sources and services that drive its mission functions. These data sources are of variable pedigree, quality, and efficacy and are frequently developed in silos, presenting substantial challenges. The data the Government leverages is becoming increasingly complex with the growing use of geospatial data, large sets of unstructured data, and streaming data from sources such as Internet of Things (IoT) devices. Despite these challenges, Government agencies must seek new ways to exploit data to support their missions and improve the return on investment for taxpayers. Effective Data Analytics is crucial to extracting maximum value from information.

Data Analytics is the discipline of identifying, extracting, cleansing, transforming, mining, and visualizing data to surface valuable information that helps leadership make critical decisions. The effective employment of Data Analytics provides agencies with high-value data about their organization, their partners, and their stakeholders. It also reduces costs by decreasing the need to build potentially redundant systems containing data that already exists in authoritative sources but has not yet been identified.
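
As a small, hedged example of this discipline, the pandas sketch below cleanses and aggregates a hypothetical grants dataset; the column names and CSV source are illustrative assumptions, and a real effort would pull from an authoritative source.

```python
import pandas as pd

# Hypothetical extract; in practice the data might come from an
# authoritative database or service rather than a flat file.
df = pd.read_csv("grants.csv")

# Cleanse: drop rows missing an award amount and normalize agency names.
df = df.dropna(subset=["award_amount"])
df["agency"] = df["agency"].str.strip().str.upper()

# Transform and mine: total awards per agency per fiscal year, ready to
# hand off to a visualization tool.
summary = (
    df.groupby(["agency", "fiscal_year"])["award_amount"]
      .sum()
      .reset_index()
)
print(summary.head())
```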

eGLOBALTECH’S DATA ANALYTICS

Our Advanced Data Analytics Framework provides an extremely agile approach to discovering, analyzing, and leveraging data – including innovative approaches for data analytics, predictive analytics, and sentiment analysis. Our comprehensive framework focuses on the following best practices:

  • Goals and Metrics – Establishing goals prior to engaging in data analysis is essential to defining the right metrics and for keeping data analysis efforts on target. Similarly, knowing the target metrics keeps the team focused and helps avoid scope creep.
  • Data Pedigree – Understanding the pedigree of your data sources is essential to ensure you are accessing the most authoritative and reliable data possible.
  • Data Virtualization – Data virtualization uses data in place rather than relying on costly and error-prone file import/export. Using data virtualization wherever possible reduces the need for expensive and time-consuming data loading, thereby reducing costs and data latency (see the sketch after this list).
  • Service Level Agreements (SLA) – Establishing SLAs is an essential element in data analytics to ensure external and partner-owned data sources are maintained with appropriate levels of availability, reliability, data latency, and quality.
  • Agile Analysis – Agile software engineering practices are ideal for data analytics because they promote early prototyping of data followed by increasing refinement. Presenting data to users early lets them help shape the discovery and analysis of additional data, and the use of wireframes can drastically improve the efficacy and usefulness of visualizations.
  • Self-Service – Effective data analytics solutions provide the right data services and visualization capabilities that enable users to derive answers on demand.
  • Microservices – Use of highly performant, rapidly developed microservices based on open standards such as Representational State Transfer (REST) and JavaScript Object Notation (JSON).
  • Security – Data security is essential for protecting organizational assets and privacy and reducing organizational vulnerability to hacking attempts.
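
To illustrate the data virtualization practice above, the sketch below queries a source system in place with SQLAlchemy and pandas rather than exporting and re-importing files. The connection string, table, and columns are placeholders.

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string; Oracle, SQL Server, or PostgreSQL on
# Amazon RDS would work the same way with a different URL.
engine = create_engine("postgresql://analyst:secret@datahost/grants")

# Query the authoritative source directly: no export files, no load jobs,
# and therefore no added cost or data latency.
query = """
    SELECT agency, fiscal_year, SUM(award_amount) AS total_awards
    FROM awards
    GROUP BY agency, fiscal_year
"""
df = pd.read_sql(query, engine)
```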

Our leading-edge framework delivers the following key features:

  • Rapid delivery of initial data capabilities via our DevSecOps framework
  • Lowest possible data latency with ideal pedigrees
  • Accelerated delivery and deployment of data services
  • Extensive application of open data standards such as microservices and service bus technology
  • Expertise in advanced data analytics tools such as Cloudera Hadoop, Pentaho, and numerous open source visualization technologies
  • Cross-database platform support including Oracle, Microsoft SQL Server, IBM DB2, Amazon RDS, and most leading platforms
  • Self-service Business Intelligence using leading-edge solutions including Tableau and QlikView
  • Advanced analytics using SAS, Cognos BI, Business Objects, and the R programming language

Contact us at info@eglobaltech.com if you would like more information on this topic!

 

 


DevSecOps Drives Reliable and Secure Software


BACKGROUND

Software delivery in the Federal Government is transforming at the fastest pace since the advent of the Internet. With increasingly tight budgets and growing cybersecurity threats, the government must deliver more with less, and do it more securely. However, many software development projects are plagued by fragmentation that separates development, operations and maintenance, and security teams into silos. This results in numerous adverse effects, including untimely delivery, defective code, and vulnerable code. Traditional waterfall methodologies feature a serial chain of phase gates that must be completed in a specific order: requirements, design, development, and testing are executed as a chain of events, with documentation driving readiness reviews between each phase. In most cases, security scans are executed only when software is promoted to production, with discovered vulnerabilities forcing rework and exposing the network to threats.

With DevOps, developers create continuous delivery pipelines that enable them to build, deploy, and test software with every check-in. Code is unit tested, regression tested, deployed, and validated with each build. Highly customized deployment scripts give way to defining infrastructure as code, which accelerates application deployment from days to minutes. DevOps also reduces technical debt because code is deployed and validated continuously throughout development.
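
A minimal sketch of infrastructure as code, assuming an AWS environment: the snippet below declares a resource in a CloudFormation template and creates the stack with boto3 from a pipeline step, replacing hand-run deployment scripts. The stack and bucket names are hypothetical.

```python
import json
import boto3

# Infrastructure declared as data: a single S3 bucket for build artifacts.
# Real templates declare networks, compute, and permissions the same way.
template = {
    "Resources": {
        "ArtifactBucket": {"Type": "AWS::S3::Bucket"}
    }
}

cloudformation = boto3.client("cloudformation")
cloudformation.create_stack(
    StackName="pipeline-artifacts",  # hypothetical stack name
    TemplateBody=json.dumps(template),
)
```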

DevSecOps represents a refinement of DevOps, highlighting the security aspect. It breaks down barriers by providing an operations and security conscious software development paradigm that fuses development, operations, and security into a streamlined process. DevSecOps incorporates all aspects of security into every facet of development and deployment. Information assurance and cybersecurity activities are integrated into the agile development process, ensuring steady progression against certification and accreditation requirements. DevSecOps provides an integrated approach unifying teams, technologies, and processes for faster, more robust, and secure products.

DEVSECOPS EVOLVED: DEVSECOPS CENTER OF EXCELLENCE AND EGT LABS®

eGlobalTech (eGT) established eGT Labs® as a forward-leaning research and development environment to solve challenging client problems and create high-value products and services. eGT Labs, in turn, launched the DevSecOps Center of Excellence (CoE) to develop best practices and technical guidance for successful DevSecOps deployment. The DevSecOps CoE defined the following best practices as critical to deployment:

  • Establish the Culture – DevSecOps is not a singular process, nor is it a single lifecycle. It is an innovative approach to systems engineering that focuses on teamwork, integration of cross-cutting concerns, and success through frequent iteration.
  • Coaching Is Key – Coaches should be heavily engaged during early adoption to help transform managers, engineers, and even contracting staff to fully understand DevSecOps and effectively interact with it.
  • Security-First Design – Security should be applied not only to the code but also to the processes involved in coding. Building security in from project initiation helps streamline deployment and delivery.
  • Automation – DevSecOps performed the eGT way features automation at all levels, including code generation, deployment, testing, and security testing, to maximize the impact of DevSecOps.
  • Build Often/Deploy Often/Test Often – A key aspect of DevSecOps is daily builds and deployments with testing in every build. Products built with DevSecOps are better tested and more secure than products built without it.
  • DevSecOps Friendly Acquisition – DevSecOps thrives when teams are fully integrated and encouraged to collaborate. Enhancing acquisition strategies that favor integration of cross-functional teams is essential to reducing costs and developing better products.

eGT Labs developed a framework that enables rapid delivery of secure, superior-quality solutions by incorporating security and operations readiness from day one. Our leading, end-to-end framework and toolkit, DevOps Factory®, includes the following critical elements:

  • Implements security-first design and development.
  • Automates security governance and controls consistent with Ongoing Authorization (OA).
  • Secures the continuous integration/continuous delivery (CI/CD) pipeline through authentication, secure storage of build artifacts, key management, etc.
  • Automates security testing, static code analysis, configuration management, incident response and forensics, secure backups, log monitoring, and continuous monitoring and mitigation (see the sketch after this list).
  • Incorporates compliance with FISMA, NIST, and other applicable federal standards and guidelines.
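
As a hedged sketch of that automated security testing, the snippet below runs Bandit, an open source static analyzer for Python, as a pipeline gate and fails the build when findings appear. The scanned path is a placeholder, and DevOps Factory® itself may use different tooling.

```python
import subprocess
import sys

# Run Bandit recursively over the source tree; a non-zero exit code means
# the scan found potential security issues at or above the threshold.
result = subprocess.run(
    ["bandit", "-r", "src/", "-ll"],  # -ll: report medium severity and up
    capture_output=True,
    text=True,
)

print(result.stdout)
if result.returncode != 0:
    # Fail the pipeline stage so insecure code never reaches deployment.
    sys.exit("Security scan failed: fix the reported findings and rebuild.")
```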

DEVSECOPS APPLIED

A public sector eGT client had a complex geospatial system prototype composed of Microsoft and open source applications with a growing number of ArcGIS services. This prototype was used in a production capacity and encountered frequent outages and performance issues.

To solve these issues, we applied DevOps Factory® to re-engineer the target architecture, implement security-first design, and automate the end-to-end cloud migration onto our managed AWS infrastructure.

Results included:

  • Migrated and operationalized a secure geospatial cloud ecosystem to AWS within three months, compliant with federal security standards.
  • Securely on-boarded over a dozen complex applications and systems.
  • Seamlessly supported more than 400% growth in geospatial services.
  • Achieved 99.99% operational availability.


Contact us at info@eglobaltech.com if you would like more information on this topic!

 
