Tools We Use

This page uses a pretty broad definition of “tool” as anything we use to help us perform a job or carry out a particular function. In our lexicon, a tool can be anything from an engineering process to a computer language to a structured workshop format. Here's a sampling:


Augmented Reality

Augmented reality is the newest tool to emerge from the convergence of cloud computing and mashups. Whereas virtual reality creates a simulation of the real world, augmented reality adds information to a live, direct or indirect, view of the real world: virtual computer-generated imagery (which can include text) is merged with the real-world image to create a mixed (augmented) reality.

A primitive example is the flashing of sports scores over the live view of a game on TV. With mobile and cloud computing, and advanced AR technologies such as computer vision and object recognition, the real world surrounding the user becomes interactive: the live view guides the retrieval of a layer of contextual information that is overlaid on it, deepening the user's understanding of the subject.

A new and very sophisticated use of augmented reality is the projection of imaging data from an MRI onto a patient’s body to pinpoint a tumor and guide the surgeon in its removal. Somewhere in between sports scores and guided surgery are applications like adding an audio commentary and related imagery to the view of a historic building to make a user’s experience of the place more meaningful, superimposing the project timeline and budget status over a real-time video capture of a construction site, or overlaying a map of a school district with locator pips that show the progress of school buses along their routes. The uses of augmented reality are virtually infinite, and the range of tools available to create it makes applications affordable to almost anyone.
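The school-bus example above reduces to a simple geometric mapping: each vehicle's reported coordinates are converted into a position on the map layer, where a pip can be drawn. A minimal sketch, with district bounds and routes invented purely for illustration:

```python
def geo_to_pixel(lat, lon, bounds, size):
    """Map a latitude/longitude to a pixel position on a map image
    covering the given bounds (a linear, small-area approximation)."""
    (lat_min, lon_min), (lat_max, lon_max) = bounds
    width, height = size
    x = (lon - lon_min) / (lon_max - lon_min) * width
    y = (lat_max - lat) / (lat_max - lat_min) * height  # pixel y grows downward
    return round(x), round(y)

# A school-district map covering a 0.2 x 0.2 degree area, rendered at 800x800 px.
district = ((38.8, -77.2), (39.0, -77.0))
buses = {"Route 12": (38.95, -77.15), "Route 7": (38.85, -77.05)}

# Each pip is the overlay position for one bus on the map layer.
pips = {name: geo_to_pixel(lat, lon, district, (800, 800))
        for name, (lat, lon) in buses.items()}
print(pips)
```

A production AR overlay would add camera pose and perspective correction, but the core step of anchoring data to a view is just this kind of coordinate transform.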


Computing Technology Tools

Call it Information Technology, Computers, or Management Information Systems: whatever your preferred term, it is ubiquitous in business process automation and in everyday life, and selecting the right tools in this area is a particular challenge for organizations whose core competency lies elsewhere. There are a few techniques we feel are particularly important and useful, and that we tend to recommend to our clients. Of course, our recommendations depend on what the client is trying to accomplish, and they may change as new technology emerges.

Parametric Multi-Dimensional Modeling
Parametric modeling is a well-established dynamic engineering and design technology that provides enormous efficiencies in engineering development through the interface between a set of rules (the parameters) and a virtual 3D representation of structures or environments (the model). Until recently, the software and hardware tools were prohibitively expensive for all but multi-hundred-million-dollar projects, but new open source software and powerful PCs have made the technology affordable to almost anyone. Parametric modeling is used for pre-production simulation that validates design objectives, projected ROI and adherence to standards, as well as for post-production monitoring and ROI verification.
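The rule-to-model interface can be illustrated with a deliberately tiny example: a beam whose geometry and loading are derived entirely from a few parameters, so changing one input updates everything downstream. The rules used here (a rule-of-thumb depth-to-span ratio, the textbook simply supported moment formula w·L²/8) are simplifications chosen for the sketch; real parametric tools drive full 3D models the same way.

```python
from dataclasses import dataclass

@dataclass
class BeamParameters:
    """The rules: every derived quantity flows from these inputs."""
    span_m: float                   # clear span of the beam
    load_kn_per_m: float            # uniformly distributed load
    depth_to_span: float = 1 / 20   # rule of thumb: depth as a fraction of span

class ParametricBeam:
    """A minimal parametric model: derived values are computed from the
    rules, never stored, so changing one parameter updates the whole model."""

    def __init__(self, params: BeamParameters):
        self.params = params

    @property
    def depth_m(self) -> float:
        return self.params.span_m * self.params.depth_to_span

    @property
    def max_moment_knm(self) -> float:
        # simply supported beam under uniform load: w * L^2 / 8
        p = self.params
        return p.load_kn_per_m * p.span_m ** 2 / 8

beam = ParametricBeam(BeamParameters(span_m=10.0, load_kn_per_m=12.0))
print(beam.max_moment_knm)  # 150.0

# "What if" the span grows? Change one parameter; the rest follows.
beam.params.span_m = 12.0
print(beam.depth_m, beam.max_moment_knm)
```

The same pattern, scaled up to thousands of interdependent rules, is what makes pre-production "what if" simulation cheap once the model exists.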
Integrated Modeling
Integrated Parametric Modeling and Integrated Information Modeling are about making things easily visible so they can be better managed. The models draw related data together to show the whole picture of the health of a process, organization or project. Through a single point of entry, you can drill down to the narrowest level of detail, trace dependencies and follow trends. Information, structures and projects can all be managed through an interface that rapidly reveals, through recognizable patterns, what’s working and what isn’t. On the back end, data is mined from inside and outside sources and updated in real time to show influences and correlations; correlations become active parameters, making it easy to develop projections and create “what if” scenarios.
Web-Based Mashups
Web-based mashups use the user’s Web browser to combine, integrate and reformat disparate data from a diverse set of sources, independent of the source platforms. A typical use might be to overlay information pulled from one or more databases on a map or virtual geography, so that when the user clicks on a particular location, the relevant information about that location is displayed on the same page. In our experience, web-based mashups are usually easier and faster to implement, and require fewer resources, than server-based mashups or portals, which analyze and reformat the data on a remote server and deliver it to the user's browser in its final form.
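The combine-and-reformat step at the heart of a mashup can be sketched independently of the browser: pull records from two feeds and join them on a shared key into one overlay-ready structure. The feeds and field names below are invented for illustration; in a web-based mashup the same join would run in client-side JavaScript against live services.

```python
import json

# Two disparate "sources": a facilities feed and an inspection-status feed,
# arriving as JSON from (hypothetical) separate services.
facilities_feed = json.dumps([
    {"id": "lib-01", "name": "Central Library", "lat": 38.90, "lon": -77.10},
    {"id": "sch-04", "name": "Oak Elementary", "lat": 38.92, "lon": -77.12},
])
status_feed = json.dumps({"lib-01": "open", "sch-04": "closed for repairs"})

def mashup(facilities_json, status_json):
    """Combine both feeds client-side into one overlay-ready structure,
    independent of where either feed came from."""
    facilities = json.loads(facilities_json)
    status = json.loads(status_json)
    return [{**f, "status": status.get(f["id"], "unknown")}
            for f in facilities]

for point in mashup(facilities_feed, status_feed):
    print(f'{point["name"]} ({point["lat"]}, {point["lon"]}): {point["status"]}')
```

Each merged record carries both location and status, so a map layer can place a marker and show the joined details when the user clicks it.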
Cloud Computing
Strictly speaking, cloud computing is a conceptual approach or paradigm, not a specific tool. The term refers to the use of the Internet to access dynamically scalable and often virtualized resources, provided as a service. The advantage of cloud computing is that users (or the users’ IT departments) do not need to know, understand or manage the technology infrastructure that is “serving up” the information through the “cloud.”

Engineering Processes

Structured processes borrowed from different engineering fields can be very useful in any field.

Systems Engineering
Systems engineering is a holistic, interdisciplinary approach to the design of technologies, processes, products or structures that proceeds from concept to production to operation through a structured development process, integrating business objectives and technical expertise to develop quality solutions that meet user needs. This approach recognizes that each system is an integrated whole composed of diverse, specialized elements and aims to optimize the overall system functions by maximizing synergy among its elements.
Usability Engineering
Outside of the software development industry, usability engineering is often called user interface engineering/design. Whatever it's called, it is an organized, systematic approach to ensuring that a process or product, when complete, will be attractive and useful to the intended user. Albert Einstein put it this way: “Concern for the man himself and his fate must always form the chief interest of all technical endeavors.” This pronouncement elegantly states what we consider obvious, and what we consider the driver of our business: every design or engineering project begins and ends with the user. The approach can also be applied to internal process re-engineering, to ensure that rapid and effective adoption leads to the intended improvements in efficiency. Typical steps include functionality requirements collection and definition, user analysis, systems architecture, prototyping, and pilot program launch.

Analytical Tools

7-Step Process Improvement

  1. Document what employees say they do
  2. Audit the process to see what they are doing
  3. Document the process, identifying re-work loops, redundancies, bottlenecks, and manual processes that could be automated
  4. Lean the process
  5. Document the new process
  6. Train to the new process
  7. Go back to step 1.

The method is iterative until no further improvements can be identified, and should be repeated periodically as part of regularly scheduled internal continuous improvement audits.
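The iterative shape of the seven-step protocol can be sketched as a loop that repeats until an audit turns up nothing left to fix. The audit and lean functions below are hypothetical stand-ins for what are, in practice, human activities:

```python
def improve(process, audit, lean):
    """Repeat document -> audit -> lean until an audit finds no issues."""
    while True:
        documented = list(process)          # steps 1-3: capture the process as practiced
        issues = audit(documented)          # re-work loops, redundancies, bottlenecks
        if not issues:
            return documented               # no further improvements identified
        process = lean(documented, issues)  # steps 4-6: drop non-value-added steps

# Toy process where every "re-" step is treated as re-work to be leaned out.
steps = ["intake", "re-check intake", "approve", "file", "re-file"]
final = improve(
    steps,
    audit=lambda p: [s for s in p if s.startswith("re-")],
    lean=lambda p, bad: [s for s in p if s not in bad],
)
print(final)  # ['intake', 'approve', 'file']
```

The loop terminates only when an audit passes cleanly, which mirrors step 7's "go back to step 1" and the periodic re-audit described above.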

Often the first task we undertake for a client is an Independent Program and Project Analysis (IPPA). Strictly speaking, an IPPA is a product, not a tool, and it’s the first step in a Business Process Improvement initiative. We call it a tool because our clients typically use the report as a continuing reference as they develop plans to improve both efficiency and effectiveness within their organizations.

In preparing an IPPA, we use quality assurance audit protocols to perform a business process audit, typically on top-level processes and selected critical procedures. The areas of focus are determined in collaboration with the client, based on known problems or critical needs. We evaluate the effectiveness of a process and the quality of the output by working through the first three steps of our seven-step process improvement protocol. Our findings and recommendations include potential ways to “Lean” the process, that is, to eliminate steps that don’t add value. Another technique we borrow from the quality assurance toolbox is root cause analysis, a process that examines a problem by asking not just what it is, but why it is occurring and what causal relationships are associated with it.

The final IPPA describes the context and assumptions around a catalog of findings and recommendations. A typical IPPA will recommend changes that can save 5 to 20 percent of an organization’s operating budget while providing measurable improvements in the quality of goods or services.