Tech Talk: NBAA

Recently, I began preparing a session for the NBAA conference in Orlando targeted at the aviation industry. I struggled to identify the needs of the audience, since they were a bit different from the groups I usually have the opportunity to speak to. The following article is not the presentation I gave, but an early direction meant to introduce several technology concepts to the group and help them understand how these topics could improve their business processes and general operations. The primary topics included:

  • Electronic Data
  • Specialized Software (and SaaS)
  • The Cloud
  • Mobile
  • Security Tips

Though I ultimately took the material in a slightly different direction, I think it still has some value, so I'm sharing it here.

Migrating from paper to electronic systems is a challenge. Technically, the data needs to be structured in a way that computer systems understand, but the real challenge comes from user adoption. Paper lets you write anything you want, lets you change your workflow however you need in the moment, and is comfortable for workforces that are not yet at ease with computers.

However, the move to electronic data allows for real-time validation that significantly reduces mistakes, gives us visibility into trends as they emerge, and lets several people access the information at the same time.

Take reporting aviation discrepancies, for example: at Flightdocs, we have seen a number of operators switch from paper-based write-ups to electronic ones. Images and video can be captured and attached to the discrepancy for later evaluation, and over time we can begin to track trends in part failure or unexpected use.

As you begin to move your data to electronic systems, you may be tempted to start with Excel or other general-purpose software. This is a great start, but it has many drawbacks. Data is typically not validated, and the spreadsheet doesn't reflect how the data is really used, such as proper tolerances or required fields.

Collaboration also becomes a real issue: emailing files back and forth is fraught with errors, and if the file lives on a network share it can typically only be in use by one person at a time.

Look for specific software that solves an important problem for you. Whether it is maintenance, flight scheduling, inventory, or accounting — find an expert company that can help you tackle these problems in a purpose driven way.

In most cases, I advise against buying on-premise software if possible. This is software that you have to install and maintain at your company, and it comes with all kinds of hidden costs and complexity. At Flightdocs, we are both customers of software as a service and a provider of it as our business. This is software that is hosted by someone else, often in a cloud, and is accessed over the internet for a monthly or yearly fee.

The cloud, in its simplest form, is a way of renting servers from another company, with quite a bit of magic thrown in to handle massive scaling. However, this oversimplification shouldn't understate how important this shift in technology is and all of the tremendous opportunities it now gives us.

In the past, it was incredibly difficult to scale quickly and across continents. It meant purchasing servers, setting up data centers, staffing the appropriate IT resources to manage the hardware and software, and keeping everything up to date and running smoothly.

The cloud allows all of that scale and complexity to disappear so that you can simply use or develop applications, and it has led to more innovation in mobile software and internet-enabled embedded devices.

Now that you’ve moved from paper to electronic, selected the right targeted software for your operations, and have access to that data anywhere through the internet, look to mobile for access away from your desktop.

Imagine each leg of your flight updating compliance metrics and aircraft times in real time to help keep your due list in check or notify home base of necessary maintenance or inventory orders.

You could even dispatch work to individuals who can follow up with their mobile devices and keep getting the latest information throughout the day.

Now that we’ve built up this discussion with all of the good things you can do with technology, let’s share a couple of important drawbacks.

When you go to software as a service or to the cloud, you intentionally give up a lot of responsibility. This can also work against you in that you may not have as much control if there is an issue. In computers, we all know that things aren’t perfect. Outages do happen, hardware does fail, and mistakes are made. If you are already outfitted with the best experts in supporting a production quality network and application, then it may not make sense for you to give over this control.

Security is a double-edged topic. If the data you are storing is highly sensitive, such as weapons systems data or medical patient information, then you may want to reconsider a cloud provider. This is not to say that a cloud provider is necessarily less secure, but you have less direct oversight and therefore may be unable to answer some specific security requirements for certain certifications. Recent high-profile breaches show what is at stake:

  • Home Depot — 56m credit cards potentially stolen through installed malware on cash register machines.
  • JP Morgan — Month long attack stealing 76m names, email addresses, addresses, and phone numbers of account holders.
  • Ebay — 145m user accounts potentially compromised by hackers stealing employee accounts.
  • Adobe — 152m credentials accessed and sensitive information erased.
  • Target — 70m records stolen from compromised magnetic strips on card readers.

When you opt to start moving more and more data to the cloud, you're making your information more accessible. This is a good thing, but access needs to be controlled so that only the right people have it. There are several steps that you can take to further protect yourself and your data.

Here are a few helpful tips:

  • Always use a strong password. Strong passwords can be harder to remember, but they provide much better security.
  • Never use the same password on more than one system. By enforcing this, you limit your exposure if by some chance a password is compromised.
  • Always ensure you are connecting over secure traffic; look for sites that show a lock in the address bar.
  • Ask for and set up multi-factor authentication to help protect you even if your password is stolen. Multi-factor authentication, also called two-factor authentication, pairs your username and password with a second factor, such as your phone, to confirm login attempts.
  • When using a software-as-a-service company, ensure your password is hashed when stored. Flightdocs uses one-way hashing to prevent decryption attacks (a minimal sketch of what one-way hashing looks like follows this list).
  • Ask about encrypted data practices when moving data to the cloud. Not all data needs to be encrypted, but data that you consider sensitive should be.
  • Keep computers and browsers up to date with the latest patches.
  • Install virus and malware scanning software on your computer to help prevent attacks.
  • Always set a pin or login for your mobile phone. Phones are easy to steal and provide lots of information as we adopt mobile access strategies.
  • Be careful with roaming settings on your phone due to wireless hijacking.
  • Backup your data, but also be careful to encrypt and protect backups as they can become vulnerable sources of data.
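
For the technically inclined, here is a minimal sketch of what one-way password hashing can look like in C#. One common approach is PBKDF2, shown below using the framework's built-in implementation; the salt size and iteration count are illustrative assumptions on my part, not Flightdocs' actual settings.

using System;
using System.Security.Cryptography;

class PasswordHashingDemo
{
    //Hash a password with a random salt using PBKDF2.
    //One-way: the original password cannot be recovered from the stored value.
    static string HashPassword(string password)
    {
        byte[] salt = new byte[16];
        using (var rng = new RNGCryptoServiceProvider())
        {
            rng.GetBytes(salt);
        }

        using (var pbkdf2 = new Rfc2898DeriveBytes(password, salt, 10000))
        {
            byte[] hash = pbkdf2.GetBytes(32);
            //Store salt and hash together; both are needed to verify later.
            return Convert.ToBase64String(salt) + ":" + Convert.ToBase64String(hash);
        }
    }

    //Verify by re-hashing the supplied password with the stored salt
    //and comparing the result to the stored hash.
    static bool VerifyPassword(string password, string stored)
    {
        string[] parts = stored.Split(':');
        byte[] salt = Convert.FromBase64String(parts[0]);
        string expected = parts[1];

        using (var pbkdf2 = new Rfc2898DeriveBytes(password, salt, 10000))
        {
            return Convert.ToBase64String(pbkdf2.GetBytes(32)) == expected;
        }
    }

    static void Main()
    {
        string stored = HashPassword("correct horse battery staple");
        Console.WriteLine(VerifyPassword("correct horse battery staple", stored));  //True
        Console.WriteLine(VerifyPassword("not the password", stored));              //False
    }
}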

Catching up to MVC

Contrary to my nature, I’ve been reluctant to adopt the “latest and greatest” from Microsoft for the past twelve months or so. A good deal of my tech lag time has been due to my primary position leading an Oracle Enterprise Business Suite (EBS) project, which has put me in the world of red, not blue. It’s been the unfamiliar – Oracle DB, JDeveloper, and PuTTY – that I’ve been working in lately, more than anything made in Redmond. However, there is an equally valid reason: I’d lost a bit of my Microsoft faith over countless hours of podcasts, articles, and events that left me completely unsure what Microsoft was doing (and questioning whether Microsoft knew as well). For quite a while it felt as though everything Microsoft created was tossed into the ecosystem somewhat half-baked, and whatever received the most buzz stuck. Perhaps this is Microsoft adopting some of the open source mentality that has been so hard on it in the past; perhaps it was simply meant to create a competitive nature internally at Microsoft; but to some critics, and even a handful of die-hard developers like myself, it seemed scattered and without direction.

There’s a bit more of a problem that I now understand more fully than when I was younger: time is not an infinite resource. Before, I would jump wholeheartedly into learning a new technology with vigorous disregard for whether that technology would rise to the top or fall by the wayside. The cost was minimal; it only meant time away from things I probably shouldn’t have been doing anyway.

Now that I have a daughter (as well as another on the way) and other family demands, I have to choose very carefully what my training time can go to. Should the focus be MVC, WP7, Silverlight, BizTalk (or gasp, Oracle SOA Suite), Entity Framework, jQuery, HTML5/CSS3, Azure, Denali, Kinect SDK, or any of the now vast array of Microsoft offerings available to developers?

So, as I’ve found myself occupied in other technologies, I rode the fence to see what shook out. It looks like Microsoft is finding their groove again and it’s time to brush aside the fallout and introduce the winner(s).

Welcome MVC to My Toolbox

As most of you know, MVC as a pattern has been around for quite a while. It is for this reason that I originally waited to see whether Microsoft’s implementation would survive, or whether the community would look to past MVC frameworks and go back to something from the open source world or a port of an established Java MVC implementation.

Obviously, that does not appear to be the case and Microsoft’s MVC implementation has been a huge success.

So now it’s time to get to work. I’ll be periodically posting about my learning process with MVC if any other developers are interested. There is a huge list of resources now available for MVC and as I post (or you comment), I will try and steer developers to what I consider the most useful.

Disclaimer and decisions made about the example material

Please note that there is still a plethora of examples showing simple MVC apps, which I am intentionally choosing not to follow in these blog posts. I am starting with a fairly large project in order to accurately compare MVC to “the real world of enterprise applications”. I’ve also decided to put logic and models behind a service layer instead of directly in the MVC project. P.S.: way to go, MVC, for allowing this; it should be played up more in examples!

Also, in case you didn’t pick up on this earlier, this is a learning journey for me with MVC – I am by no means an expert… yet.

Without further ado, here are my notes from the first foray into MVC with an enterprise application.

Project Orange (the anti-apple)

It’s not important what this project is, but it is intended to have multiple tiers where the web tier will be MVC. Let’s look at the setup of the projects:

Project Orange Projects

  • BusinessDataLogic is a combined BLL and DAL using Entity Framework.
  • Models are the business entities and DTOs.
  • Services are WCF services which wrap the business logic and expose DTOs (a rough sketch of what such a contract might look like follows this list).
  • Utilities are helper classes common across the tiers (extension methods, etc.).
  • Web is MVC.
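
Since I keep referring to the service layer, here is a rough sketch of what the Inventory contract could look like. To be clear, the namespace, DTO properties, and parameter names below are placeholders of my own for illustration; only the SearchItems(int, int, string) shape mirrors the call I actually make from the controller below.

using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;

namespace ProjectOrange.Services
{
    //Hypothetical DTO exposed to the web tier (property names are placeholders)
    [DataContract]
    public class ItemDto
    {
        [DataMember]
        public int ItemId { get; set; }

        [DataMember]
        public string PartNumber { get; set; }

        [DataMember]
        public string Description { get; set; }
    }

    //Hypothetical WCF contract wrapping the business/data logic
    [ServiceContract]
    public interface IInventoryService
    {
        //Parameter names are guesses; only the (int, int, string) shape
        //matches the SearchItems call used in the controller later on.
        [OperationContract]
        List<ItemDto> SearchItems(int siteId, int pageNumber, string searchText);
    }
}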

MVC Presentation Tier

Let’s take a closer look at the web application and the asset structure:

Project Orange MVC

As you can see, I’ve added a service reference for my WCF Inventory service. I’ve also added an ItemController and a corresponding View directory for Item.

I’ve added one new method within the ItemController for searching:

//
// GET: /Item/Search/had

public ActionResult Search(string text)
{
   var client = new InventoryService.InventoryClient();
   var items = client.SearchItems(1, 1, text);
   return View(items);
}

I’ve created a Search view which was auto-scaffolded based on the strongly typed model. The scaffolding is a nice feature and I can see this being extensible in the future.
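
If you haven’t seen the scaffolding output before, the generated Search view ends up looking roughly like the sketch below. The model type and property names here are placeholders of my own, not the actual Project Orange classes, and the real scaffolded markup includes a few more helpers and action links:

@model IEnumerable<ProjectOrange.Web.InventoryService.ItemDto>

@{
    ViewBag.Title = "Search";
}

<h2>Search</h2>

<table>
    <tr>
        <th>Part Number</th>
        <th>Description</th>
    </tr>

    @foreach (var item in Model)
    {
        <tr>
            <td>@item.PartNumber</td>
            <td>@item.Description</td>
        </tr>
    }
</table>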

Two things tripped me up for a short while and required some trial and error. The first was that I needed to create a new routing rule in Global.asax to handle my Search action URL format (routes are matched in the order they are registered, so a rule like this needs to appear before the default {controller}/{action}/{id} route, otherwise "had" gets bound to id instead of text):

//Search Item
routes.MapRoute(
   "SearchItem",
   "{controller}/{action}/{text}",
   new { controller = "Item", action = "Search", text = UrlParameter.Optional }
);

The second was discovering that _Layout.cshtml is essentially the master page (called a layout in MVC terms) and that it is applied to every view by default via _ViewStart.cshtml.
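
For reference, the _ViewStart.cshtml generated by the project template is tiny; it simply points every view at the shared layout (the path shown is the template default, so adjust it if you move things around):

@{
    Layout = "~/Views/Shared/_Layout.cshtml";
}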

That’s actually not a bad start for just opening up a project template and getting going; the wizard-style adding of files was very intuitive. Now it’s on to the resources to start learning more.

What Resources I’m Using This Week

Pluralsight: I’m finishing up the free MVC video on ASP.NET and like the quality so much that I’ve requested seats for our entire team through work. The catalog is certainly Microsoft-based but has some “fringe” technology courses as well.

I’m not paid for any recommendation or traffic to Pluralsight, I just like their material.

HTML5 by Bruce Lawson and Remy Sharp: In line with adopting MVC, I think this is also an appropriate time to embrace HTML5 (as the rest of the world has gone crazy over it), since the HTML that MVC generates now appears to use the HTML5 doctype as well.

I did enroll in Amazon’s referral program, so there is a slight compensation for purchasing the book after clicking the image below.

Qxtend Query Service, .NET, and Dexter

Work, a new house, a young daughter, and watching Dexter from the beginning don’t leave a lot of time for writing a software blog. Fortunately, tonight I finished the last episode (terrible ending) and it’s time to get back to writing!

To catch you up to the current season in my storyline: I’ve been working in the enterprise world, the land of three-letter acronyms (TLAs). I recently asked a few of my co-workers to help me come up with a short list to capture the type of work we’ve been doing; here’s what we came up with:

  • ERP – Enterprise Resource Planning
  • MES – Manufacturing Execution System
  • MRP – Material Requirements Planning
  • CRM – Customer Relationship Management
  • EAM – Enterprise Asset Management
  • PLM – Product Lifecycle Management
  • DHR – Device History Record
  • DHF – Design History File
  • MDR – Master Device Record
  • NCR – Non-Conformance Record
  • ECO – Engineering Change Order
  • BOM – Bill of Materials
  • BPR – Business Process Reengineering
  • ATP – Available To Promise
  • ISO – International Organization for Standardization
  • CNC – Computer Numerical Control
  • RMA – Return Merchandise Authorization
  • ROI – Return On Investment
  • CSR – Customer Service Representative
  • DNS – Domain Name System
  • EFT – Electronic Funds Transfer
  • FTP – File Transfer Protocol
  • JIT – Just In Time
  • POS – Point of Sale
  • CAD – Computer Aided Design
  • CAE – Computer Aided Engineering
  • RFP – Request For Proposal
  • WIP – Work In Process

In addition to all of the TLAs, we’ve been working with a ton of new (and old) technology, which is something I thought I would post about. What’s interesting about working with many of these ERP-related systems is that if they are not SAP or Oracle, they don’t seem to get a lot of exposure. This makes working on these systems much harder because information is no longer a Google (or Bing) search away. Not only is the technology older, but the way to get information about it is a throwback to the days of (gasp) reading manuals and asking real people.

The particular ERP system we use is called QAD, which uses a web service based interface called Qxtend to communicate with other systems. Trust me, more than once I tried searching for how to interface with QAD via Qxtend and .NET, but to no avail. Working with Qxtend as a .NET developer is extremely different from consuming typical web services, at least for us at this stage in our understanding. Instead of adding a service reference or using WCF or a similar framework, we had to do this in an older, more manual way, which I will outline below. It does seem to work quite reliably, but there were several gotchas both in setting up the Query Service on the Qxtend side and in consuming it from .NET. All of the code on the .NET side can be used to connect to other SOAP based web services, so if you aren’t using QAD, don’t worry – you can still get a bit out of this code.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Net;
using System.Xml;

namespace QxtendSoapCall
{
    class Program
    {
        static void Main(string[] args)
        {
            //Prep information related to Qxtend 
            //(these values are made up for demo purposes)
            string url = "http://qad-qxtend/services/QDocWebService";
            string receiver = "qad_rcv";
            string domain = "abc";
            string version = "v1_1";
            string sourceApp = "abc";
            string profile = "GetCustomerReps";
            int maxRows = 0;  //unlimited

            //Set filter for query service (optional)
            //(these values are made up for demo purposes)
            string part = "ABC-123";
            string filter = string.Format("pt_part = {0}", part);

            //Load generic xml request template from external file
            //(create an empty QueryService QDOC request to begin and add placeholders)
            string templatePath = @"C:\XmlData\QueryService.xml";
            string requestXML = string.Empty;
            using (StreamReader streamReader = new StreamReader(templatePath))
            {
                requestXML = streamReader.ReadToEnd();
            }

            //Replace template values with values for this query service
            requestXML = requestXML.Replace("{receiver}", receiver);
            requestXML = requestXML.Replace("{domain}", domain);
            requestXML = requestXML.Replace("{version}", version);
            requestXML = requestXML.Replace("{sourceApplication}", sourceApp);
            requestXML = requestXML.Replace("{profile}", profile);
            requestXML = requestXML.Replace("{filter}", filter);
            requestXML = requestXML.Replace("{maxRows}", maxRows.ToString());

            //Clean up template
            requestXML = requestXML.Replace("\n", "").Replace("\r", "");

            //Prep service call variables for qxtend
            WebRequest request = null;
            WebResponse response = null;
            string xmlResponse = string.Empty;

            try
            {
                //Prepare web request
                request = WebRequest.Create(url) as HttpWebRequest;
                request.Method = "POST";  //post method
                request.ContentType = "text/xml";  //xml
                request.Headers.Add("soapaction", url);  //soapaction header
                request.Headers.Add("Synchronous", "Yes");  //synchronous
                request.Timeout = 30000;  //30 seconds timeout expiry

                //Encode xml string into byte array
                byte[] byteData = Encoding.UTF8.GetBytes(requestXML);
                request.ContentLength = byteData.Length;

                //Post byte array
                using (Stream postStream = request.GetRequestStream())
                {
                    postStream.Write(byteData, 0, byteData.Length);
                    postStream.Close();
                }

                //Get web response
                response = request.GetResponse() as HttpWebResponse;

                //Pull response into stream
                Stream stream = response.GetResponseStream();

                //Read stream
                StreamReader reader = new StreamReader(stream);
                xmlResponse = reader.ReadToEnd();
            }
            catch (WebException webEx)
            {
                //TODO: Handle your web exceptions here
            }
            catch (Exception ex)
            {
                //TODO: Handle your general exceptions here
            }

            //Convert string to XmlDocument (or XDocument)
            XmlDocument xdoc = new XmlDocument();
            if (!string.IsNullOrEmpty(xmlResponse))
            {
                xdoc.LoadXml(xmlResponse);
            }

            //TODO: Do something with XML now that you have data from QAD
        }
    }
}

That’s it, now you’ve got a way to generically call the QAD Qxtend Query Service from .NET without needing an XSD or creating service references. This is fairly new for me, so if you see any bugs or better approaches, please leave a comment!
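
One small follow-up: the response comes back wrapped in a SOAP envelope, so pulling values out of the XmlDocument usually means dealing with XML namespaces. A lazy way around that is an XPath local-name() query, continuing from the xdoc variable in the listing above. The element name "pt_part" here is just a guess based on the filter field used earlier; the actual names depend entirely on the Query Service profile you defined.

//Hypothetical example: adjust the local-name() value to match the fields
//returned by your Query Service profile.
XmlNodeList nodes = xdoc.SelectNodes("//*[local-name()='pt_part']");
foreach (XmlNode node in nodes)
{
    Console.WriteLine(node.InnerText);
}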