Wednesday, July 25, 2012

The Meaningful Use of Health Informatics



This week in class we had the pleasure of hearing Dr. Tom Savel talk about his research lab at the Centers for Disease Control and Prevention (CDC).  His background is in medicine; he holds an MD with training in general surgery and general practice.  But he is also a technology enthusiast.  His lab helps develop technology products (websites, mobile apps, etc.) that can be utilized by doctors and public health employees.


He introduced the concept of "meaningful use" in terms of technology and eRecords in public health and medicine.  According to the U.S. Department of Health & Human Services and the CDC, the concept of meaningful use of technology in the health fields is still being defined.  But Dr. Savel's group is trying to help shape the future of technology at the CDC and to be a leader in the field of health informatics.


You can see more about Tom's research lab group and what they are trying to accomplish on their website.






Tuesday, July 10, 2012

Following the Rabbit Down the Google Map Trail



Google recently hosted their Google I/O Conference in San Francisco.  You can find more about this conference on their website.  I was very interested in hearing about any new innovations in Google Maps, as that is probably the most interesting product they've developed over the years, especially considering the "stiff" competition that was already on the scene.  Brian McClendon, Dylan Lorimer, and Thor Mitchell presented on this topic.


Google Maps is highly integrated into how we interact with our world.  It's embedded into websites; it's utilized on our mobile devices; it's on every evite that's sent.  It enables us to create routes for a vacation or learn a new environment on a whim, as if we were locals.  Street View lets you actually see what's at the location you are researching so that you won't be surprised by construction or a closed restaurant.


Having integrated map functionality that Google can build into its search engine is just good business.  Maps, latitude and longitude, images, points of interest: they are all just data points, and that is what Google does best, rendering data that is comprehensive, accurate, and highly visible in our everyday world.  This product fills a need we have to know where we are going and to find new places to go.  The world today is "on the go", and Google has done a wonderful job of hearing what their users need and filling the gap.  They've added the Street View navigator, upped the quality of their maps, and built out Map Maker to improve their developers' apps for things like hiking trails.
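
Since all of those map features boil down to data lookups, here is a minimal sketch (my own illustration, not something from the presentation) of pulling one of those data points yourself: geocoding an address through Google's Geocoding web service with Python's requests library.  The API key and address below are placeholders.

    import requests

    # Look up the latitude/longitude of an address via the Google
    # Geocoding web service (the API key is a placeholder).
    API_KEY = "YOUR_API_KEY"
    address = "1600 Amphitheatre Parkway, Mountain View, CA"

    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": address, "key": API_KEY},
    )
    resp.raise_for_status()
    results = resp.json()["results"]

    if results:
        location = results[0]["geometry"]["location"]
        print(address, "->", location["lat"], location["lng"])

Once an address is reduced to a latitude/longitude pair like that, it can be plotted, routed, or attached to a business listing, which is exactly the kind of data Google is so good at organizing.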


Google Maps is now used in the business world.  Businesses build out their web presences and customize them to properly present their business to the world through Google and Google Maps.  Businesses have a strong interest in making sure that the people who are looking for them can find them easily and accurately.  Google Maps can also completely revamp a business if that business deals heavily in various locations, such as state parks with hiking and biking trails, or a tractor-trailer company that wants to find the best routes for its truckers, taking traffic and construction into account.  Google Maps has been around for years, and I expect it to be around for years to come.  I look forward to seeing how it evolves!


Tuesday, March 27, 2012

Fast Times at "Big Data" High



"Big Data" is showing up more and more in the Information Technology News so you might be asking yourself "What is Big Data"?  Big Data is the representation of extremely large data sets and the management of these data sets using new and improved systems that are specifically made to handle these data sets quickly.  These data sets are usually filled with auto-generated data that was collected using sensors or cameras. [1]

One of the tools developed to work with Big Data is the NoSQL database.  With the large amount of data manipulation going on, a new way of storing and accessing data needed to be created.  NoSQL databases focus on retrieving items from storage efficiently and in real time. [2] A few of the pros of using NoSQL databases include elastic, transparent scalability and more relaxed data model restrictions, which make data model changes easier to implement. [3]  NoSQL databases still have a ways to go, though, as the next generation of technicians is struggling to keep up, and they are far from a hands-off solution right now.
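
To make that relaxed data model a bit more concrete, here is a small sketch using MongoDB, one popular document-oriented NoSQL database, through its pymongo driver.  The host, database, and collection names are made up for the example.

    from pymongo import MongoClient

    # Connect to a local MongoDB instance (hypothetical database/collection names).
    client = MongoClient("mongodb://localhost:27017")
    sensors = client["demo_db"]["sensor_readings"]

    # No schema is declared up front: documents in the same collection can
    # carry different fields, which is what makes model changes cheap.
    sensors.insert_one({"sensor_id": 42, "temp_c": 21.5})
    sensors.insert_one({"sensor_id": 43, "temp_c": 19.8, "humidity": 0.61})

    # A simple, real-time style lookup.
    for doc in sensors.find({"temp_c": {"$gt": 20}}):
        print(doc["sensor_id"], doc["temp_c"])

Adding the "humidity" field required no migration at all, which is the kind of flexibility a traditional relational schema change would not give you for free.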


The introduction of new "databases" into the market also welcomes new management packages like Hadoop, Pig, and Hive.  Hadoop "is a framework that allows for the distributed processing of large data sets across clusters of computers using a simple programming model." [4]  Hadoop is composed of two parts: the Hadoop Distributed File System and high-performance parallel data processing. [5] Pig and Hive are both related to Hadoop and its functionality.  Pig is "a high-level data-flow language and execution framework for parallel computation," and Hive is "a data warehouse infrastructure that provides data summarization and ad hoc querying." [6]
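
To give a feel for that "simple programming model," here is a classic word-count job written as Hadoop Streaming mapper and reducer scripts in Python.  This is only a sketch; the script name and the way you would wire it into a real cluster are illustrative.

    #!/usr/bin/env python
    # wordcount.py - word count in the MapReduce style used by Hadoop Streaming.
    # Test locally with:
    #   cat input.txt | python wordcount.py map | sort | python wordcount.py reduce
    import sys

    def mapper():
        # Emit "word<TAB>1" for every word on stdin.
        for line in sys.stdin:
            for word in line.strip().split():
                print("%s\t%d" % (word.lower(), 1))

    def reducer():
        # Input arrives sorted by key, so counts for a word are contiguous.
        current, count = None, 0
        for line in sys.stdin:
            word, value = line.rstrip("\n").split("\t")
            if word != current:
                if current is not None:
                    print("%s\t%d" % (current, count))
                current, count = word, 0
            count += int(value)
        if current is not None:
            print("%s\t%d" % (current, count))

    if __name__ == "__main__":
        mapper() if sys.argv[1] == "map" else reducer()

Pig and Hive sit on top of this model: roughly the same job can be expressed as a few lines of Pig data-flow script or an SQL-like Hive query instead of hand-written map and reduce functions.
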
So why is "Big Data" important? I believe this quote from a recent article sums it up: "The amount of data in our world has been exploding, and analyzing large data sets—so-called big data—will become a key basis of competition, underpinning new waves of productivity growth, innovation, and consumer surplus, according to research by MGI and McKinsey's Business Technology Office." [7]  Big Data is where the future is as more time passes and more equipment are being developed to collect more and more data from the world.  The faster you can manipulate that data and get it back out into the world, the more successful you will be going forward.  The age of starting a program and walking away from a few days until it's done running are over and no longer acceptable to standards in today's age.


More Information about Big Data in the Media Recently:
Adobe Unleashes Predictive Analytics on Big Data Complexity
Will Google Big Query Transform Big Data Analysis?
Customers, Big Data, and the Internet of Things
Mobile, social and big data drive cloud computing boom: studies
Microsoft destined to follow in big data



Wednesday, March 21, 2012

For Your 'Leaner' Wednesday



ALT Tag: "Our company is agile and lean with a focus on the long tail. Ok, our company is actually a polecat I found in my backyard."


Brought to you by the awesomeness that is XKCD: http://xkcd.com/1032/

Wednesday, March 14, 2012

The "Leaner" Side of Life

Lean, or Lean Thinking, or the Toyota Way, is a concept that defines a general process of 'continuous improvement' and 'respect for people'.  This process can be applied to many situations, but this article mostly describes it in terms of producing a service.  Part of the 'continuous improvement' aspect, also known as kaizen, is to always challenge everything and to always be open to change.  Toyota defined this as "always being dissatisfied with the status quo."


The 'Lean-Thinking House' visually shows these two concepts working in tandem to build a solid foundation for product development and management.  The top tier, aka the 'roof', represents the goal to be accomplished.  On the bottom, you have the 'foundation', which represents solid management that applies and teaches lean thinking to its teams and bases its decisions on lean, long-term thinking.  On the two sides, you have the two pillars that define lean thinking.  And in the middle, you have the heart: product development and the 14 principles.  The 14 principles were developed by Toyota over its decades of product development and include items such as 'move toward flow' and 'grow leaders from within'.  The product development area represents focusing "on creating more useful knowledge and learning better than the competition," then using that knowledge and making sure it does not go to waste.  Lean thinking is about creating high-value, low-cost information to utilize in the development process, as shown below.



As for my thoughts on the lean methodology, I think it's an interesting concept and would be open to working in such an environment.  It definitely has its benefits as a developer, especially since your employer is actually trying to enhance your knowledge and to maintain your loyalty as an employee.  I'd be curious to see how this might be implemented in my work environment in education.  Many times we are 'firefighters' in addition to our development duties.  Just how this would work out in an environment where each individual wears many hats, from customer support, to system administration, to development, to management and administrative duties, I don't know.  Would this only work in a profit-driven company?  How easily could this be accomplished by other organizations?  There actually is a group on campus currently training up on Lean Six Sigma; it will be interesting to see what they do with this knowledge and whether they are successful in implementing it and getting buy-in from their management.  I'll update you here if I learn more about their implementation.  That's one thing I wish this primer had delved deeper into: how this might work, and does currently work, in companies other than Toyota.






Thursday, March 1, 2012

"Hammer"ing Out the BPM System

According to Michael Hammer, "Business Process Management (BPM) is a comprehensive system for managing and transforming organizational operations, based on what is arguably the first set of new ideas about organizational performance since the Industrial Revolution."  In his article, "What is Business Process Management?", he discusses topics such as its origin and its development.  Hammer credits the work of Shewhart and Deming as the first to approach BPM with statistical analysis of problems that occurred during production; this methodology was too generally defined for Hammer's taste.  His work, then, is the second definitive work on the subject matter.


Hammer then goes on to discuss the process management cycle, shown below, which was eventually developed by merging the two definitive methodologies of himself and of Shewhart and Deming.  This cycle is derived from Deming's Plan-Do-Check-Act cycle.  Hammer then discusses how "through process management, an enterprise can create high-performance processes, which operate with much lower costs, faster speeds, greater accuracy, reduced assets, and enhanced flexibility."  He goes on to describe the operational benefits that might be realized through such a process.  But he also notes that this road is not always easy; there are certain elements that must be addressed to allow BPM to flourish: process design, process metrics, process performers, process infrastructure, and the process owner.  Hammer then continues on to discuss the items that need to be able to change for success, such as leadership, governance, culture, and expertise.



As a summary, Hammer reviews the main principles of process management: all work is a process; any process is better than no process; a good process is better than a bad process; one process version is better than many; even a good process must be performed effectively; even a good process can be made better; and every good process eventually becomes a bad process.  After that, Hammer discusses how the Enterprise Process Model (EPM) can be used as a graphical tool to enhance BPM.  An example is shown in the graphic below.  Lastly, Hammer gives his ideas on where BPM might be going in the future.





In my opinion, Hammer is definitely one of the frontiersmen of BPM.  While his writing can be a little hard to follow, his methodology is not.  It's quite simple and has a bit of lean thinking's 'challenge everything' mantra.  He describes most things in three phases: what it was doing, what it is doing, and what it will be doing.  Always a continuous improvement process.  It will definitely be interesting to see where the BPM methodology goes in the future and what its adoption rates will be.

Tuesday, February 28, 2012

Visionary Leader of BPM: Adobe LiveCycle




Product Review:  Adobe LiveCycle


Slogan:  "Automate processes and improve communications"



URL:  http://www.adobe.com/products/livecycle/


Summary:


Adobe LiveCycle is made up of three business process management tools:  Process Management, Business Activity Monitoring, and Content Services.  The business process management tools are only a portion of the entire package, which also includes a rich internet application (RIA) portion and Document Services for communication, forms, and security features.


"Streamline your enterprise business processes" [1]  The Process Management suite is extremely versatile and contains many ideal capabilities such as a visual process and UI design environment, a centralized repository with versioning, and a "design once, deploy anywhere" work flow.  The PM suite also also for complete customization via Flex programming for the more tech savvy developers but still has advanced code-less deployment as well.

"Maximizing Business Performance with BAM" [2]  The Business Activity Monitoring suite is also completely customizable and has event-driven decision making.  BAM is made up of a dashboard, a workbench, an Analytics Server, and rights management that all work together in tandem.  BAM uses SQL-based semantics to combine events and content in the workbench to meet deliverables in the dashboard.

"Share, Manage, and Retain Content" [3]  The last big piece for BPM in LiveCycle is the Content Management system, which allows you to manage content through enterprise libraries and content connectors.  It enables you to streamline your content reviews and archive content based on retention policy.  And since it's an Adobe product it enables the auto-transformation of content into Adobe PDF files.

(Pictures courtesy of http://www.liventerprise.com/tool/Adobe_LiveCycle_ES/)

Tuesday, January 17, 2012

To Design Software, or to Design Software? The Comparison of Two Common Software Engineering Processes

Last week in my Software Systems Analysis class, we looked at two different perspectives on the software engineering process:  one proposed by our Satzinger textbook and one outlined in SWEBOK, the Software Engineering Body of Knowledge.  Below I have matched up the functional areas that I think are related, with Satzinger in red and SWEBOK in blue.


Functional Areas that Line Up:

  1. Satzinger's first step is to "Identify the problem or need and obtain approval."  I believe this loosely maps to SWEBOK's "Software Engineering Process" phase, which covers "the definition, implementation, assessment, measurement, management, change, and improvement of the software engineering process itself."
  2. The next step in Satzinger's process is to "Plan and monitor the project", which I think binds closely to SWEBOK's "Software Engineering Management" that "addresses the management and measurement of the software engineering."
  3. The next step is to "Discover and understand the details of the problem or need."  SWEBOK's "Software Requirement" maps nicely to this section. 
  4. Satzinger has next to "Design the system components that solve the problem or satisfy the need," which can map to "Software Design" in SWEBOK.
  5. The next step is to "Build, test, and integrate system components."  This lines up with SWEBOK's "Software Construction" phase.
  6. The final phase that lines up between the two is to "Complete system tests and then deploy the solution."  This lines up with "Software Testing" in SWEBOK.

While these are very loose comparisons, there are some aspects of SWEBOK that are not covered by Satzinger, or perhaps Satzinger combines them into other functional areas.  For example, SWEBOK details a section on "Software Maintenance," which covers what happens to the product after it has gone into production.  Another example is "Software Configuration Management," which defines each configuration change so that it can be traced throughout the lifetime of the product.  A third example is "Software Engineering Tools and Methods"; Satzinger does not really have a specific section on the tools and methods used but discusses both throughout all his sections.  Finally, "Software Quality" is the last section not fully discussed in Satzinger's version, but just like the tools and methods section, it too is discussed throughout each section of his process.


While this is a quick and dirty comparison, you should really review all methodologies before starting a project to verify that it will work in your environment with your team.



Friday, January 13, 2012

A Silver Bullet to the Demise of Moore's Law?

This week in my Systems Analysis class we discussed software engineering and the processes one might take in developing a software package based on the Agile fundamentals.  We also briefly discussed Moore's Law and how it's slowly coming to an end.  However, I did find this article today announcing that IBM is still making leaps and bounds in terms of hardware development.  I guess someone did not tell IBM's techs that they should sit back and relax for a while.


So, what do I foresee as being a possible "silver bullet" to combat the fatigue in Moore's Law?  The first thing we can do is to start a science and engineering education initiative in our public schools.  I didn't know anything about engineering before I went to college to become one.  I just knew that I was good at math and science and that computer engineering would be a good place to apply those skills.  But we really need to educate the next generation on what's possible and what real-life software engineers can accomplish.  We need to build momentum!


The next thing that could dramatically change the software development process is to gain buy-in from the community at large.  There are so many types of software development plans out there, and not all of them are applicable to every situation.  So many companies get into ruts with their software development planning, if they have a plan at all, that they don't see (or know) that there might be a better solution out there.  We need more places like TopCoder to really build momentum and educate the community around the cause.


The final item that could be a game changer is if a fundamentally new hardware product gets released on the market.  With the development of a new synthetic compound or with a change in how hardware and software interact, the bedrock of software development could essentially change or become obsolete.


In the immediate future, though, I don't see there being a "silver bullet" to the werewolf of stalled innovation.  We will need to make significant changes to our culture to encourage and open doors to math and science for all ages, races, and genders.  Or another Steve Jobs.  Or a major breakthrough in hardware platforms.