If you work as a security officer in private industry or in one of many government agencies, chances are you are responsible for multiple security domains for your entire organization, including personnel security, industrial security, incident management, operations security, foreign disclosure, and insider threat. The current management tools for these areas are generally paper files, Excel spreadsheets, SharePoint sites, email inboxes, and, at best, multiple stove-piped and antiquated software solutions. The result is time lost searching for and providing information, and the disconnection of systems has institutionalized inefficient processes that create additional work for each specialist handling a transaction. In a surprisingly large number of cases, inaccurate or absent data entry has led to missed handoffs that lengthen clearance processing time, hitting an organization’s bottom line unnecessarily. In the most heartbreaking cases, organizations lose their ability to hold classified contracts entirely due to non-compliance with the NISPOM and failed inspections.
Evans & Chambers, in partnership with Industrial Security Integrators (IsI), designed and developed Security Control to be the market leader in the industrial security software sector. Security Control is a cloud-native SaaS solution for industry and government that centralizes all aspects of industrial security, including modules for classified contracts, personnel, secure facilities, safes, classified materials, visits, incident management, and insider threat. The product notifies security officers and clearance holders of the actions needed to stay in compliance with security and agency policy directives. Rather than requiring security officers to hunt down employees for annual training via email and phone, Security Control assigns action items to employees within the application. Employees then click an email link to launch a portal where they can view and complete their action items.
For incident management and insider threat, Security Control offers submission forms that let employees report an incident along with supporting details and attachments. Each submission is then routed to the supervisor, the security officer, the Insider Threat Program Senior Official (ITPSO), and any other key personnel within the organization, as required by your organizational policies. Finally, a simple export produces the report for the relevant government authorities.
Security Control’s latest feature is automated DCSA self-inspections. Before this feature, organizations would spend up to a month emailing an MS Word questionnaire to key corporate personnel and roughly 15% of the cleared workforce, with each recipient answering a subset of questions. After chasing down responses and verifying their accuracy and completeness, the security office would manually compile the data into a coherent report with a cover letter and submit it to DCSA in the hope that it met compliance requirements. Security Control’s automated self-inspection workflow lets the security office complete this entire process accurately in just a few clicks.
Security Control was built in the Amazon Web Services GovCloud regions with a security-first approach, and is offered in both SaaS and on-premise versions. The SaaS option ensures data isolation and integrity without requiring dedicated hardware. Our multi-tenant architecture balances security with cost-effectiveness by segmenting customer information into a unique database per tenant, while auto-scaling across multiple web servers shares customer computational workloads. Private subnets keep customer data secure and prevent unauthorized access. Security Control is in the process of achieving FedRAMP Moderate compliance.
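The per-tenant isolation described above can be sketched in miniature. This is a hypothetical illustration, not Security Control's actual code: the `TenantRouter` class, the host name, and the connection-string format are all invented for the example.

```python
# Hypothetical sketch of per-tenant database isolation. The class name,
# host, and connection-string format are assumptions for illustration only.

class TenantRouter:
    """Resolves a tenant ID to that tenant's dedicated database."""

    def __init__(self, host: str):
        self.host = host
        self._tenants: dict[str, str] = {}

    def register(self, tenant_id: str) -> None:
        # Each tenant gets its own database, so one customer's queries
        # can never touch another customer's data.
        self._tenants[tenant_id] = f"postgresql://{self.host}/tenant_{tenant_id}"

    def connection_string(self, tenant_id: str) -> str:
        # Unknown tenants fail closed rather than falling back to a
        # shared database.
        if tenant_id not in self._tenants:
            raise PermissionError(f"unknown tenant: {tenant_id}")
        return self._tenants[tenant_id]


router = TenantRouter("db.internal.example")
router.register("acme")
print(router.connection_string("acme"))  # postgresql://db.internal.example/tenant_acme
```

Failing closed on unknown tenants is the key design choice here: a routing mistake surfaces as an error rather than a cross-tenant data leak.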
Industrial Security Integrators, our strategic partner, came onboard as our first client in 2017. Together, we migrated 4,000 personnel clearances from over 200 Federal government contracting firms into the system. Today, our client base covers 14,000 personnel clearances and over 550 Federal Government contractors. As Security Control continues to grow, clerical errors and time lost during the clearance process will continue to decrease. We are committed to staying ahead of the compliance curve by building innovative new features as DCSA adds regulations to the NISPOM. Through these efforts, we are excited to be recognized as a leading contributor in helping our clients achieve and maintain security, efficiency, and compliance.
Machine learning and artificial intelligence (AI) services have become an integral part of many government-off-the-shelf (GOTS) and commercial-off-the-shelf applications (COTS). Did you know that there are convenient ways to train, manage, and deploy machine learning algorithms and services?
On July 23rd, Dave Rabrun, one of Evans & Chambers’ lead software developers, held a Tech Talk for the company exploring big data, open-source processing, and the challenges of building custom machine learning tools in the public sector. Dave also discussed the advantages of Amazon SageMaker, a cloud-based service that supports the entire machine learning workflow. With custom data processing tools becoming increasingly necessary for public sector clients, it is important to stay on top of the technologies that make those tools a reality.
The presentation provided the audience with a deep understanding of why DoD and Intelligence Community agencies need to build custom image classification, video recognition, natural language processing and sentiment analysis tools. The presentation also provided a live demonstration of training an interactive data model.
About the EC Tech Talk Series
The Tech Talk Series is an employee-led platform dedicated to EC’s core value of continual learning. These talks aim to cover a broad range of technology-based topics to promote the sharing of best practices and ideas across EC’s project teams.
Do you think you’d be able to explain User Experience (UX) Design and how it fits into the application development cycle? You may be interested to know that UX Design covers the entire cycle, including aspects of branding, design, usability, and function.
On Tuesday, June 25th, Autumn Richardson, one of Evans & Chambers’ User Experience Designers on the DID(it) team at USCIS, held a Tech Talk for the company explaining what user experience design is and what makes a well-designed application.
In addition, Autumn explored how to incorporate users’ needs into each feature to improve workflow, drawing examples from an application she is currently designing that helps asylum officers maximize their time and reduce interview backlogs. In a busy and fast-paced world, efficiency is more important than ever. UX Design allows us to create products with meaningful, relevant experiences that help users concentrate on what is necessary to accomplish their goals for the day.
On April 30th, 2016, the fifth annual Broccoli City Festival was held at Gateway DC in Southeast Washington, DC. This was the first year Evans & Chambers had the privilege of partnering with the Broccoli City Festival, a 501(c)(3) non-profit organization that highlights accessible ways people can live healthier lifestyles in an environmentally sustainable way. As the organizers explain it, Broccoli City is more than just a music festival: it is a unique experience where pop culture, health, and environmental sustainability are celebrated together.
This year, Evans & Chambers furthered Broccoli City’s mission of attendee engagement by developing and deploying the official Broccoli City Festival app, available as a free download for iOS from the Apple App Store. The app added a layer of interaction to the festival: festival-goers could receive exclusive information on the performing musical artists, locate food trucks and vendors, view a unified social media feed that aggregated fan posts to Twitter, Instagram, and SoundCloud via the #BCFEST hashtag, and more.
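The unified-feed idea is simple to sketch: pull posts from each network, keep the ones carrying the festival hashtag, and merge them into one timeline. The following is an illustrative Python sketch only, not the festival app's actual code (which was built in Xamarin); the `Post` type and sample data are invented for the demonstration.

```python
# Illustrative sketch (not the festival app's actual code) of merging
# hashtagged posts from several networks into one timeline, newest first.

from dataclasses import dataclass


@dataclass
class Post:
    network: str    # e.g. "twitter", "instagram", "soundcloud"
    author: str
    text: str
    timestamp: int  # Unix epoch seconds


def unified_feed(feeds: list[list[Post]], hashtag: str = "#bcfest") -> list[Post]:
    """Merge per-network feeds, keeping only posts that carry the hashtag."""
    merged = [p for feed in feeds for p in feed if hashtag in p.text.lower()]
    return sorted(merged, key=lambda p: p.timestamp, reverse=True)


twitter = [Post("twitter", "@fan1", "Amazing set! #BCFEST", 1461974400)]
instagram = [Post("instagram", "fan2", "Food trucks everywhere #bcfest", 1461978000)]
feed = unified_feed([twitter, instagram])
print([p.network for p in feed])  # ['instagram', 'twitter']
```

In a real app the per-network feeds would come from each platform's API; the merge-and-sort step stays the same.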
When Evans & Chambers was approached to create this app exclusively for the festival, we proposed a multi-phased approach: deliver the Broccoli City Festival app in time for the April 2016 festival, with additional features released over the rest of the 2016 calendar year. The app was developed as an iOS-only application in Xamarin Studio (recently acquired by Microsoft), with a custom API back-end coded in ASP.NET and hosted on the Microsoft Azure cloud.
In the days leading up to the festival and on the day itself, the app saw healthy traffic. As of May 2, 2016, it had received 2,018 downloads, with 594 occurring on the day of the festival. As of May 3, 2016, it had also received 3,014 views, with 1,009 occurring on the day of the festival. With over 13,000 people in attendance at the festival, these numbers account for almost 10% of the entire crowd.
More and more Government agencies and commercial clients are finding ways to dramatically cut expenses by migrating mission-critical applications to lower-cost hosting environments. Cloud solutions are an effective choice, and EC’s solutions architects are helping our clients get there!
You may be familiar with the term: cloud computing leverages economies of scale and efficiency by offering distributed infrastructure hosting. Amazon Web Services (AWS) is one of the largest and most innovative providers of cloud IT solutions. With AWS, organizations pay by the hour for computing resources, software, and services, purchasing only what they need and eliminating up-front and recurring costs for physical servers, operating system software, server room space, and hardware maintenance. There are no minimums and no contracts, so organizations can be as flexible as they want: some choose to run applications only during core business hours, shutting them down when they aren’t needed. This model also makes it inexpensive to build temporary prototypes. AWS scales automatically as demand grows, something that simply isn’t an option when purchasing servers, since an organization would typically buy enough hardware to handle the anticipated maximum production load. AWS scales horizontally (adding more servers) or vertically (adding more horsepower to existing servers) based on demand; when demand subsides, capacity drops back to baseline.
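The elasticity described above comes down to a simple rule: run a baseline fleet, add servers as demand grows, and fall back when it subsides. Here is a toy sketch of that horizontal-scaling decision (not AWS code; in practice an Auto Scaling policy driven by monitoring metrics makes this call, and the baseline and per-server capacity numbers below are assumptions):

```python
# Toy sketch of a horizontal-scaling rule: never fewer servers than the
# baseline, and enough servers to cover current demand. The capacity
# numbers are invented for illustration.

import math

BASELINE_SERVERS = 2          # assumed always-on fleet size
REQUESTS_PER_SERVER = 1000    # assumed capacity of one instance, per minute


def desired_capacity(requests_per_minute: int) -> int:
    """Return how many servers to run for the current demand."""
    needed = math.ceil(requests_per_minute / REQUESTS_PER_SERVER)
    return max(BASELINE_SERVERS, needed)


print(desired_capacity(250))   # 2  (baseline covers light load)
print(desired_capacity(4500))  # 5  (scale out under peak demand)
print(desired_capacity(0))     # 2  (back to baseline when demand subsides)
```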
EC uses Amazon Web Services on several projects:
- Web-based solutions for private companies that help them streamline processes and increase response time
- A big data product that aggregates and semantically analyzes complex data in real-time
- As part of our Government solutions to help clients move systems from costly hardware to cloud solutions
Our clients are finding that by saving resources on IT infrastructure, they can redirect investments into critical areas.
Java was developed by Sun Microsystems in the early 1990s as a system for professional, skilled programmers to develop application software.
There were other, similarly named languages, however. While developing Internet Explorer as a competitor to Netscape Navigator, Microsoft created its own dialect of Netscape’s JavaScript (a distinct language from Java, despite the name) and dubbed it “JScript” in order to avoid trademark issues.
Bootstrap was developed by Mark Otto and Jacob Thornton at Twitter as a framework to encourage consistency across internal tools. According to the Twitter Developer’s Blog:
In the earlier days of Twitter, engineers used almost any library they were familiar with to meet front-end requirements. Inconsistencies among the individual applications made it difficult to scale and maintain them. Bootstrap began as an answer to these challenges and quickly accelerated during Twitter’s first Hackweek.
Here’s what a couple of our own developers had to say about Bootstrap:
“I’m having some issues between some of the css files overriding each other (in Firefox, bootstrap loads first, but in IE bootstrap-responsive loads first) and they have conflicting values, but overall it seems really nice.”
“I found it easy to integrate, and a lot of fun to use. Highly recommended as a starting point for application development.”
Countless commercial organizations are using Bootstrap to create their sites. Even government agencies have adopted it: NASA used it to create code.NASA, the agency’s forum for discussing open source creations, and the National Geospatial-Intelligence Agency used Bootstrap to build its unclassified app store.
It’s an exciting time for Government data! With the release of Steven VanRoekel’s Digital Government Strategy, government data is about to become much more accessible.
The purpose of the strategy is to encourage agencies to provide access to government data, information, and services on multiple devices for use by the public, entrepreneurs, and other agencies; and for agencies to procure and manage devices, applications, and data in smart and affordable ways.
Developers and entrepreneurs use government data to spur innovation by using various data sources to develop useful applications, such as the Swiftwater Calculator, which helps First Responders locate victims of swift water incidents during a rescue.
VanRoekel’s vision is that the new digital approach utilizes a shared platform approach that is information- and customer-centric while maintaining appropriate security and privacy.
With guidance from OMB, the Federal CIO Council, and advisory groups, Agencies are challenged to meet the following goals throughout the next year:
Within one month:
✓ Establish an agency-wide governance structure for developing and delivering digital services.
Within three months:
✓ Engage with customers to identify at least two existing major customer-facing services that contain high-value data or content as first-move candidates to make compliant with new open data, content, and web API policy.
✓ Engage with customers to identify at least two existing priority customer-facing services to optimize for mobile use.
Within six months:
✓ Develop an enterprise-wide inventory of mobile devices and wireless service contracts.
✓ Implement performance and customer satisfaction measuring tools on all .gov websites.
Within one year:
✓ Ensure all new IT systems follow the open data, content, and web API policy, and operationalize agency.gov/developer pages.
✓ Make high-value content and data in at least two existing major customer-facing systems available through web APIs, apply metadata tagging, and publish a plan to transition additional high-value systems.
✓ Evaluate the government-wide contract vehicles in the alternatives analysis for all new mobile-related procurements.
✓ Ensure all new digital services follow digital services and customer experience improvement guidelines.
✓ Optimize at least two existing priority customer-facing services for mobile use and publish a plan for improving additional existing services.
Read the Digital Government Strategy Report in its entirety here.
With the recent news that CIA drones had targeted al Qaeda’s second-in-command, I decided to do a little investigative research into what makes the Predator drone tick (and I rightly decided that the drone isn’t something you’d want to pick on!). Here are a couple of facts I found interesting:
✓ Drones (more formally known as Unmanned Aerial Vehicles, or UAVs) run on the same type of engines used to power snowmobiles
✓ It takes a team of 82 personnel to successfully run a fully operational mission
✓ UAVs can be programmed to automatically return home if the data link is lost
✓ The Predator drone (the type used to target al Qaeda’s Abu Yahya al Libi) carries a Hellfire missile, which weighs 500 pounds, and, when detonated, leaves a 15-foot-wide crater
✓ The Predator drone is run by a computerized Tactical Control System (TCS), which was developed by Raytheon in 1999, and was the first to conform to NATO standards
✓ The Tactical Control System that controls the Drone is run by a computer using a type of Unix-based operating system
Check out a more extensive article that focuses on how drones work and the technologies that run them: Share411 – Drone Technology
Recently, the Department of Homeland Security was the subject of a Homeland Security Subcommittee on Counterterrorism and Intelligence hearing entitled, “DHS Monitoring of Social Networking and Media: Enhancing Intelligence Gathering and Ensuring Privacy.” The discussion centered around the agency’s monitoring of popular social media sites for counterterrorism purposes. Listening to the issues at the hearing, I got to wondering: with Twitter alone averaging 2,200 tweets per second, how can one possibly search and analyze everything posted on social media sites? The answer: Apache Hadoop.
When agencies and companies need to mine data sets that are too large to be analyzed by desktop analytical tools, they turn to Hadoop. The open source software scales up for large projects to allow for the distributed processing of extremely large data sets. The work is “divided into many small fragments of work, each of which may be executed or re-executed on any node in the cluster.” You can visualize this by imagining the way a search and rescue party canvasses a large area of land: by dividing the mapped terrain into subsections, amassing a team of volunteers, and assigning sections to each individual. In an example of searching 1 million Tweets for the word “drugs,” Hadoop begins with a master server, which passes pieces of the problem (a command to search 100,000 Tweets, say) to sub-servers. Those sub-servers divide the problem further, perhaps by commanding two of their own sub-servers to search 50,000 Tweets each. This division of labor can continue, depending on the size of the problem.
These servers conduct the search, sending back the number of Tweets containing the word “drugs.” The mid-line servers collect results from each of their sub-servers, and pass the information to the master server. In this way, the problem of searching 1 million Tweets is divided among several servers that can perform complex searches faster than one server alone.
So fast, in fact, that in 2008, Hadoop reported that “one of Yahoo’s Hadoop clusters sorted 1 terabyte of data in 209 seconds, which beat the previous record of 297 seconds in the annual general purpose terabyte sort benchmark.”
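The divide-and-combine pattern described above, mapping work out to fragments and then reducing the partial results, can be sketched in miniature in plain Python. This is illustrative only: a real Hadoop job distributes the fragments across cluster nodes rather than iterating over them in one process.

```python
# Miniature sketch of the map/reduce pattern: each "worker" counts
# matching tweets in its fragment, and the "master" sums the partials.

def map_count(tweet_fragment: list[str], word: str) -> int:
    """One worker's job: count tweets in its fragment containing the word."""
    return sum(1 for tweet in tweet_fragment if word in tweet.lower())


def reduce_counts(partial_counts: list[int]) -> int:
    """The master's job: combine the workers' partial results."""
    return sum(partial_counts)


tweets = [
    "new report on drugs policy",
    "concert tonight!",
    "Drugs seized at the border",
    "weather looks great",
]
# Split the work into fragments, as a master server would across nodes.
fragments = [tweets[:2], tweets[2:]]
partials = [map_count(fragment, "drugs") for fragment in fragments]
print(reduce_counts(partials))  # 2
```

Because each fragment is counted independently, the map step parallelizes freely, which is exactly why the approach scales from four tweets to a million.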