Blog

Evans & Chambers Increases Engagement at Broccoli City Festival via iOS App

by Cedric Craig May 11th, 2016 Business, Mobile, Software, Technology

On April 30th, 2016, the fifth annual Broccoli City Festival was held at the Gateway DC in SE Washington, DC. This is the first year that Evans & Chambers has had the privilege of partnering with the Broccoli City Festival, a 501(c)(3) non-profit organization that works to highlight accessible ways people can live healthier lifestyles in an environmentally sustainable way. As the organizers explain it, Broccoli City Festival is more than just a music festival: it is a unique experience where pop culture, health, and environmental sustainability are celebrated together.

This year Evans & Chambers was able to further Broccoli City’s mission of attendee engagement by developing and deploying the official Broccoli City Festival app, available as a free download for iOS from the Apple App Store. The app added an extra layer of interaction to the festival: festival-goers could receive exclusive information on the performing artists, locate food trucks and vendors, view a unified social media feed that aggregated fan posts to Twitter, Instagram, and SoundCloud via the #BCFEST hashtag, and more.

When Evans & Chambers was approached to create this app exclusively for the festival, we proposed a multi-phased approach: deliver the Broccoli City Festival app in time for the April 2016 festival, with additional features to be released during the rest of the 2016 calendar year. The codebase was developed as an iOS-only application in Xamarin Studio (Xamarin was recently acquired by Microsoft), with a custom API back-end written in ASP.NET and hosted on the Microsoft Azure cloud.

During the days leading up to the festival and on the day of the festival itself, the app saw a healthy amount of traffic. As of May 2, 2016, the app had received 2,018 downloads, with 594 occurring on the day of the festival. As of May 3, 2016, it had also received 3,014 views, with 1,009 occurring on the day of the festival. With over 13,000 people in attendance, these numbers account for almost 10% of the entire crowd.


Cost Savings with Cloud Computing

by Jamil Evans Apr 16th, 2014 Technology

More and more Government agencies and commercial clients are finding ways to dramatically cut expenses by migrating mission-critical applications to lower-cost hosting environments. Cloud solutions are an effective choice, and EC’s solutions architects are helping our clients get there!

You may be familiar with the term: cloud computing leverages economies of scale and efficiency by offering distributed infrastructure hosting. Amazon Web Services (AWS) is one of the largest and most innovative providers of cloud IT solutions. With AWS, organizations pay by the hour for computing resources, software, and services, purchasing only what they need and eliminating both the up-front and recurring costs of physical servers, operating system software, server room space, and hardware maintenance. There are no minimums and no contracts, so organizations can be as flexible as they want to be: some choose to run applications only during core business hours, shutting them down during the times they aren’t needed. This model also makes it inexpensive to build temporary prototypes.

AWS can also scale automatically as demand grows. This simply isn’t an option when purchasing servers, where an organization would typically buy enough hardware to handle the anticipated maximum production load. AWS scales horizontally (adding more servers) or vertically (adding more horsepower to existing servers) based on demand; when demand subsides, capacity drops back down to baseline.
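
As a concrete illustration of the “run only during core business hours” pattern described above, here is a minimal sketch using boto3, AWS’s Python SDK; the region and instance ID are placeholders, and in practice these functions would be triggered on a schedule (for example, cron or a CloudWatch Events rule):

    import boto3  # AWS SDK for Python

    # Placeholder values; substitute your own region and instance IDs.
    ec2 = boto3.client("ec2", region_name="us-east-1")
    INSTANCE_IDS = ["i-0123456789abcdef0"]

    def start_for_business_hours():
        """Bring the application servers up at the start of the work day."""
        ec2.start_instances(InstanceIds=INSTANCE_IDS)

    def stop_after_hours():
        """Shut the servers down overnight so no compute hours are billed."""
        ec2.stop_instances(InstanceIds=INSTANCE_IDS)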

EC uses Amazon Web Services on several projects:

  • Web-based solutions for private companies that help them streamline processes and increase response time
  • A big data product that aggregates and semantically analyzes complex data in real-time
  • As part of our Government solutions to help clients move systems from costly hardware to cloud solutions

Our clients are finding that by saving resources on IT infrastructure, they can redirect investments into critical areas.


What’s in a Name?

by Jamil Evans Jan 24th, 2013 Technology

Software and programming languages have the most intriguing names: Ruby on Rails, Lombardi, Perl, Python, Hadoop. In an industry that grows almost exponentially with new iterations of existing software and languages, creative naming conventions are almost required. The naming of Java and JavaScript has left me confused: they aren’t the same thing by any means, but what is the relationship? Apple Computer and Apple Corps have an ongoing, 30-year trademark debate over the use of “Apple,” so how do “Java” and “JavaScript” coexist peacefully?

Many believe that JavaScript was a marketing ploy by Netscape to piggyback on Java’s success, but the historical connection indicates an early partnership and a number of trademark (dis)agreements.

Java was developed by Sun Microsystems in the early 1990s as a system for professional, skilled programmers to develop application software.

JavaScript was later developed at Netscape by Brendan Eich under the name “Mocha.” The idea behind JavaScript was that web designers and novice programmers could write bits of code to enhance user experience with animation or simple commands. Eich viewed Mocha as “a little brother to Java, as a complementary language like Visual Basic was to C++ in Microsoft’s language families at the time.”

To position the language in the growing marketplace, Sun and Netscape entered into a licensing arrangement in December 1995 and agreed to market JavaScript as a complementary scripting language to Java.

There were other implementations of the same or a similar language, however. Microsoft was developing Internet Explorer as a competitor to Netscape Navigator, and it dubbed its dialect of the language “JScript” in order to avoid trademark issues.

Netscape delivered JavaScript to Ecma International for standardization in 1996, and ECMAScript became the standardized, “official” version of the language. The name “ECMAScript” was a compromise between the organizations involved in the standardization, especially Netscape and Microsoft. But Eich later commented that ECMAScript “sounds a little like a skin disease. Nobody really wants it.” It certainly seems that way, since today’s ECMAScript is better known by the trade names JavaScript, JScript, or ActionScript.


Web Design with Twitter Bootstrap

by Jamil Evans Oct 11th, 2012 Technology

Although initially released in August 2011, Twitter Bootstrap has gotten a lot of attention from EC developers lately for providing the framework for clean, responsive web and application design that is compatible with all major browsers. With just some working knowledge of HTML and CSS, anyone can create a responsive site with stylish typography, forms, buttons, navigation, labels, progress bars, and other elements. Bootstrap provides information, examples, and code snippets, but the basic download includes HTML elements stylized and enhanced with extensible classes, responsive grids, reusable components, JavaScript elements, and custom jQuery plugins.

Bootstrap was developed by Mark Otto and Jacob Thornton at Twitter as a framework to encourage consistency across internal tools. According to the Twitter Developer’s Blog:

In the earlier days of Twitter, engineers used almost any library they were familiar with to meet front-end requirements. Inconsistencies among the individual applications made it difficult to scale and maintain them. Bootstrap began as an answer to these challenges and quickly accelerated during Twitter’s first Hackweek.

Here’s what a couple of our own developers had to say about Bootstrap:

“I’m having some issues between some of the css files overriding each other (in Firefox, bootstrap loads first, but in IE bootstrap-responsive loads first) and they have conflicting values, but overall it seems really nice.”

“I found it easy to integrate, and a lot of fun to use.  Highly recommended as a starting point for application development.”

Countless commercial organizations are using Bootstrap to create their sites, and even government agencies have adopted it: NASA used it to create code.NASA, the agency’s forum for discussing open source creations, and the National Geospatial-Intelligence Agency used Bootstrap to build its unclassified app store.


More Accessible Government Data

by Jamil Evans Aug 8th, 2012 Technology


It’s an exciting time for Government data! With the release of Steven VanRoekel’s Digital Government Strategy, government data is about to become much more accessible.

The purpose of the strategy is to encourage agencies to provide access to government data, information, and services on multiple devices for use by the public, entrepreneurs, and other agencies; and for agencies to procure and manage devices, applications, and data in smart and affordable ways.

Developers and entrepreneurs use government data to spur innovation by using various data sources to develop useful applications, such as the Swiftwater Calculator, which helps First Responders locate victims of swift water incidents during a rescue.

VanRoekel’s vision is that the new digital approach utilizes a shared platform approach that is information- and customer-centric while maintaining appropriate security and privacy.

With guidance from OMB, the Federal CIO Council, and advisory groups, Agencies are challenged to meet the following goals throughout the next year:

Within one month:

  • Establish an agency-wide governance structure for developing and delivering digital services.

Within three months:

  • Engage with customers to identify at least two existing major customer-facing services that contain high-value data or content as first-move candidates to make compliant with the new open data, content, and web API policy.
  • Engage with customers to identify at least two existing priority customer-facing services to optimize for mobile use.

Within six months:

  • Develop an enterprise-wide inventory of mobile devices and wireless service contracts.
  • Implement performance and customer satisfaction measuring tools on all .gov websites.

Within one year:

  • Ensure all new IT systems follow the open data, content, and web API policy, and operationalize agency.gov/developer pages.
  • Make high-value content and data in at least two existing major customer-facing systems available through web APIs, apply metadata tagging, and publish a plan to transition additional high-value systems (a minimal sketch of such an API appears after this list).
  • Evaluate the government-wide contract vehicles in the alternatives analysis for all new mobile-related procurements.
  • Ensure all new digital services follow digital services and customer experience improvement guidelines.
  • Optimize at least two existing priority customer-facing services for mobile use and publish a plan for improving additional existing services.
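
To make the web API goals above more concrete, here is a minimal, hypothetical sketch of how an agency might expose a high-value dataset through a web API with metadata tagging, using Python and the Flask microframework (the endpoint, dataset, and field names are invented for illustration and are not part of the strategy itself):

    from flask import Flask, jsonify  # lightweight Python web framework

    app = Flask(__name__)

    # Stand-in for a high-value dataset; a real system would query a database.
    DATASET = [
        {"id": 1, "title": "Swift water incident reports", "format": "json"},
        {"id": 2, "title": "Facility locations", "format": "json"},
    ]

    @app.route("/api/v1/datasets")
    def list_datasets():
        # Metadata tags travel with the payload so the public, entrepreneurs,
        # and other agencies can discover and reuse the data programmatically.
        return jsonify({
            "metadata": {"agency": "example-agency", "license": "public-domain"},
            "data": DATASET,
        })

    if __name__ == "__main__":
        app.run()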

Read the Digital Government Strategy Report in its entirety here.


Drone On

by Jamil Evans Jun 15th, 2012 Technology

 

Predator Drone

With the recent news that CIA drones had targeted al Qaeda’s second-in-command, I decided to do a little investigative research into what makes the Predator drone tick (and I rightly decided that the drone isn’t something you’d want to pick on!). Here are a few facts I found interesting:

  • Drones (more formally known as Unmanned Aerial Vehicles, or UAVs) run on the same type of engines used to power snowmobiles.
  • It takes a team of 82 personnel to successfully run a fully operational mission.
  • UAVs can be programmed to automatically return home if the data link is lost.
  • The Predator drone (the type used to target al Qaeda’s Abu Yahya al Libi) carries a Hellfire missile, which weighs 500 pounds and, when detonated, leaves a 15-foot-wide crater.
  • The Predator drone is run by a computerized Tactical Control System (TCS), which was developed by Raytheon in 1999 and was the first to conform to NATO standards.
  • The Tactical Control System is run by a computer using a type of Unix-based operating system.

Check out a more extensive article that focuses on how drones work and the technologies that run them: Share411 – Drone Technology


The Elephant in the Room: Hadoop

by Jamil Evans Apr 5th, 2012 Technology

Recently, the Department of Homeland Security was the subject of a Homeland Security Subcommittee on Counterterrorism and Intelligence hearing entitled, “DHS Monitoring of Social Networking and Media: Enhancing Intelligence Gathering and Ensuring Privacy.” The discussion centered around the agency’s monitoring of popular social media sites for counterterrorism purposes. Listening to the issues at the hearing, I got to wondering: with Twitter alone averaging 2,200 tweets per second, how can one possibly search and analyze everything posted on social media sites? The answer: Apache Hadoop.

Apache Hadoop

When agencies and companies need to mine data sets that are too large to be analyzed by desktop analytical tools, they turn to Hadoop. The open source software scales up for large projects to allow for the distributed processing of extremely large data sets. The work is “divided into many small fragments of work, each of which may be executed or re-executed on any node in the cluster.” You can visualize this by imagining the way a search and rescue party canvasses large areas of land: by dividing the mapped terrain into subsections, amassing a team of volunteers, and assigning sections to each individual. In an example of searching 1 million Tweets for the word “drugs,” Hadoop begins with a master server, which passes pieces of the problem (a command to search 100,000 Tweets, say) to sub-servers. Those sub-servers divide the problem further, perhaps by commanding two of their own sub-servers to search 50,000 Tweets each. This division of labor can continue, depending upon the size of the problem.

These servers conduct the search, sending back the number of Tweets containing the word “drugs.” The intermediate servers collect the results from each of their sub-servers and pass the information up to the master server. In this way, the problem of searching 1 million Tweets is divided among several servers that can perform complex searches faster than one server alone.
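
To make the divide-and-combine idea concrete, here is a minimal sketch of the same kind of keyword count written as a Hadoop Streaming job in Python; the script and the file names in the comment are illustrative, not part of any actual DHS system:

    #!/usr/bin/env python
    # Simulate locally with:
    #   cat tweets.txt | python search.py map | sort | python search.py reduce
    import sys

    KEYWORD = "drugs"

    def mapper():
        # Each mapper sees one slice of the Tweets (one per line) and emits a
        # tab-separated key/value pair for every match it finds.
        for line in sys.stdin:
            if KEYWORD in line.lower():
                print("%s\t1" % KEYWORD)

    def reducer():
        # The reducer plays the role of the servers higher up the tree,
        # summing the partial counts sent back by the mappers.
        total = 0
        for line in sys.stdin:
            key, _, value = line.partition("\t")
            if key == KEYWORD:
                total += int(value)
        print("%s\t%d" % (KEYWORD, total))

    if __name__ == "__main__":
        mapper() if sys.argv[1] == "map" else reducer()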

So fast, in fact, that in 2008, Hadoop reported that “one of Yahoo’s Hadoop clusters sorted 1 terabyte of data in 209 seconds, which beat the previous record of 297 seconds in the annual general purpose terabyte sort benchmark.”


Cloud Computing 101

by Jamil Evans Mar 1st, 2010 Technology

So, you’ve heard or read about cloud computing, but exactly what does this term mean? Here, we’ll explain the concept, and compare it to a process you may already be familiar with. We’ll also discuss some pros and cons to cloud computing.

“Cloud computing” refers to the way data, software, and applications are managed, organized, and accessed; it is a mechanism for the delivery of services. Cloud computing service providers give customers, including individuals, businesses, and governments, space to store information, software, and applications off-site. No standard definition exists, but the National Institute of Standards and Technology lists the characteristics of cloud computing as “on-demand self-service, broad network access (internet standards based), location independent resource pooling, rapid elasticity, and measured service.”

As an illustration of the concept, think of how your home computer houses your software and data, including important documents and family photos. When thinking about backing up your data, you might consider a virtual backup service such as Mozy or Carbonite. These services maintain your data on their servers, which you can access over the Internet from any computer. The virtual backup service represents the experience you have when accessing data in a cloud: your data is stored on another server, and you are able to access and retrieve it at any time through an Internet site. You may have already worked off a cloud before without realizing it: Google Docs, Facebook, and Skype are examples of applications that are hosted using cloud technology. As the technology advances, it’s possible that cloud computing may even eliminate your need for a hard drive in the future.
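
As a small, present-day illustration of the off-site storage idea, here is a minimal sketch that copies a file to cloud storage and retrieves it again using Amazon S3 through the boto3 Python SDK (the bucket and file names are placeholders):

    import boto3  # AWS SDK for Python

    s3 = boto3.client("s3")

    # Upload a local file to an off-site bucket (the bucket name is a placeholder).
    s3.upload_file("family_photos.zip", "my-backup-bucket", "backups/family_photos.zip")

    # Later, from any machine with the right credentials, pull it back down.
    s3.download_file("my-backup-bucket", "backups/family_photos.zip", "family_photos.zip")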

Cloud computing may prove most valuable for companies and governments. Currently, companies purchase huge amounts of hard drive space and servers to store volumes of data. In addition, companies must have enough computing power to host public websites and private intranets. Because they must accommodate additional users or spikes in usage at particular times, companies end up investing in more capacity than is actually used at any one point in time. Organizations, including government agencies like GSA, NASA, and the Navy, are working to evaluate the cloud’s usefulness for decreasing costs by reducing the need for hardware, software, IT personnel, physical space, and maintenance. With cloud computing, a company can also take advantage of the processing power of the cloud for increases in website traffic or large calculations. In this way, additional resources can be purchased as needed, rather than in the traditional model where maximum capacity is purchased but not necessarily consumed.

As with any technology, there are possible drawbacks, and with cloud computing they come down to privacy and security. Once data storage and management are transferred to an outside party, the owner loses a certain amount of control over that data. In addition, the owner can no longer take full responsibility for the security of the data and may face increased opportunities for a security breach. To state it simply, since the owner can access its data from any location, it’s possible that others can, too.

Now that you’ve got a solid understanding of what’s at stake, what do you think? How would you feel about cloud computing solutions if you were the company or government agency? Share your thoughts or questions in the comments section below.


What’s so Web 2.0 about this site?

by Jamil Evans Feb 7th, 2008 Technology

The term Web 2.0 describes new trends in design and development that started appearing across an increasing number of successful web sites several years ago. Although some believe the term is simply a marketing buzzword wrapped around existing technologies, the reality is that Web 2.0 can solve some of the gigantic problems that our Government customers face. Tim O’Reilly, who originated the term in 2003, asserts in his article “What Is Web 2.0” that Web 2.0 marks a turning point for the web just after the dot-com collapse. The companies that survived the collapse seemed to have some things in common, and an analysis of those commonalities brings us to a better understanding of what Web 2.0 really means.

While I would not consider it strictly Web 2.0, let’s analyze the Web 2.0 qualities of this web site:

  1. Usage of contemporary graphic design techniques: effects such as gradients, soft shadows, rounded corners, and simplified layouts. A clean interface removes the fluff and places more importance on the data within the site, which is important since the data in a true Web 2.0 site is constantly and unpredictably changing.
  2. Big fonts in headers. This is a return to HTML’s origins, when the size of the text indicated its relative importance on the page.
  3. Use of Open Source software. The new web is all about sharing, and so is Open Source. We utilize PHP, MySQL, WordPress, and Eclipse in the development of this site.
  4. A blog. Any true blog brings other associated Web 2.0 ideals including an RSS feed, usage of tagging (“folksonomy”), a tagcloud, widgets, and user comments.
  5. CSS layouts instead of HTML table-based layouts where possible. This saves bandwidth and improves the maintainability of the site and cross-platform accessibility (laptop, iPhone, high-tech refrigerator).
  6. XHTML and CSS that pass validation. This improves the likelihood that the site will work across browser versions and platforms. The “X” in XHTML stands for “extensible.” By using XHTML Strict, you are ensuring your pages will display correctly on a broad variety of devices for years to come … a very important concept for large, portable, and long-sustaining web projects.
  7. Web accessibility. The entire site provides for screen readers for the vision impaired, high contrast for the color blind, and doesn’t require specific knowledge or abilities to utilize.

Other aspects of a Web 2.0 site that are not incorporated into this site (yet):

  1. Use of Ajax. This is an integral part of anything Web 2.0. It allows for page updates without needing to refresh the screen, which makes for a user experience previously unknown on the web; see Meebo.com or Netflix for examples.
  2. The Wiki. The ultimate manifestation of user-generated content, the wiki enables collaboration, which is another hallmark of Web 2.0.
  3. Web services. These allow for the newly popular mashups and REST/XML/JSON-based APIs, enabling interaction between sites and giving users more ways to take control (see the sketch below).
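
As a small sketch of the web services idea, here is how a script might pull JSON from a REST API and mash it up with local content, using only the Python standard library (the endpoint URL and field name are hypothetical):

    import json
    from urllib.request import urlopen  # Python standard library

    # Hypothetical JSON endpoint; any REST API that returns JSON works the same way.
    URL = "https://api.example.com/posts?format=json"

    with urlopen(URL) as response:
        posts = json.load(response)

    # Combine ("mash up") the remote data with local content, e.g. list the titles.
    for post in posts:
        print(post["title"])
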