Computing Technology Tools
Call it Information Technology, Computers, or Management Information Systems: whatever your preferred term, it is ubiquitous in business process automation and in everyday life, and selecting the right tools in this arena is a particular challenge for organizations whose core competency lies elsewhere. A few techniques stand out to us as especially important and useful, and we tend to recommend them to our clients. Our recommendations, of course, depend on what the client is trying to accomplish, and they may change as new technology emerges.
- Parametric Multi-Dimensional Modeling
- Parametric modeling is a well-established dynamic engineering and design technology that delivers enormous efficiencies in engineering development by linking a set of rules (the parameters) to a virtual 3D representation of structures or environments (the model). Until recently, the software and hardware tools were prohibitively expensive for all but multi-hundred-million-dollar projects, but new open source software and powerful PCs have made the technology affordable for anyone. Parametric modeling is used for pre-production simulation that validates design objectives, projected ROI and adherence to standards, as well as for post-production monitoring and ROI verification.
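As a toy illustration of the rule-to-model link described above, the sketch below regenerates a derived model whenever a parameter changes. All names, dimensions and cost figures are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class WarehouseParams:
    # The rule set: every derived quantity below follows from these parameters.
    bay_width_m: float = 10.0
    bay_count: int = 8
    eave_height_m: float = 9.0
    cost_per_m3: float = 14.0  # rough construction-cost driver (invented)

def regenerate(p: WarehouseParams) -> dict:
    """Rebuild the derived model from the current parameter values."""
    footprint = p.bay_width_m * p.bay_count * (p.bay_width_m * 2)  # plan area
    volume = footprint * p.eave_height_m
    return {
        "footprint_m2": footprint,
        "volume_m3": volume,
        "estimated_cost": volume * p.cost_per_m3,
    }

base = regenerate(WarehouseParams())
taller = regenerate(WarehouseParams(eave_height_m=12.0))
print(base["estimated_cost"], taller["estimated_cost"])  # 201600.0 268800.0
```

Changing one parameter (the eave height) ripples through every derived value, which is the "what if" behavior that makes parametric models useful for validating design objectives and projected ROI before anything is built.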
- Integrated Modeling
- Integrated Parametric Modeling and Integrated Information Modeling are about making things easily visible so they can be better managed. The models draw related data together to show the whole picture of the health of a process, organization or project. Through a single point of entry, you can drill down to the narrowest level of detail, trace dependencies and follow trends. Information, structures and projects can all be managed through an interface that rapidly reveals, through recognizable patterns, what’s working and what isn’t. On the back end, data is mined from inside and outside sources and updated in real time to show influences and correlations; correlations become active parameters, making it easy to develop projections and create “what if” scenarios.
- Web-based Mashups
- Web-based mashups use the user’s Web browser to combine, integrate and reformat disparate data from a diverse set of sources, independent of the source platforms. A typical use might be to overlay information pulled from one or more databases on a map or virtual geography, so that when the user clicks on a particular location, the relevant information about that location is displayed on the same page. In our experience, web-based mashups are usually easier, faster and less resource-intensive to implement than server-based mashups or portals, which analyze and reformat the data on a remote server and deliver it to the user's browser in its final form.
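In a browser mashup this join runs in JavaScript against JSON fetched from each source; the combining logic itself is sketched below in Python for brevity, with invented store and sales records standing in for the two data sources:

```python
# Source 1: locations, e.g. pulled from a mapping service.
locations = [
    {"id": "store-12", "lat": 40.71, "lon": -74.01},
    {"id": "store-34", "lat": 41.88, "lon": -87.63},
]

# Source 2: figures from a separate sales database, keyed by the same id.
sales = {
    "store-12": {"q1_revenue": 1.2e6},
    "store-34": {"q1_revenue": 0.9e6},
}

def on_click(location_id: str) -> dict:
    """What the page displays when the user clicks a map marker:
    the pin's own data merged with whatever the sales source knows."""
    pin = next(loc for loc in locations if loc["id"] == location_id)
    return {**pin, **sales.get(location_id, {})}

print(on_click("store-34"))
```

The key point is that neither source system needs to know about the other; the shared key (`id`, here) is enough for the browser to assemble the combined view on the fly.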
- Cloud Computing
- Strictly speaking, cloud computing is a conceptual approach or paradigm, not a specific tool. The term refers to the use of the Internet to access dynamically scalable and often virtualized resources, access to which is provided as a service. The advantage of cloud computing is that users (or the users’ IT departments) do not need to know, understand or manage the technology infrastructure that is “serving up” the information through the “cloud.”
Structured processes borrowed from the engineering disciplines can be very useful in almost any field.
- Systems Engineering
- Systems engineering is a holistic, interdisciplinary approach to the design of technologies, processes, products or structures that proceeds from concept to production to operation through a structured development process, integrating business objectives and technical expertise to develop quality solutions that meet user needs. This approach recognizes that each system is an integrated whole composed of diverse, specialized elements and aims to optimize the overall system functions by maximizing synergy among its elements.
- Usability Engineering
- Outside of the software development industry, usability engineering is often called user interface engineering or design. Whatever the name, it is an organized, systematic approach to ensuring that a process or product, once complete, will be attractive and useful to its intended users. Albert Einstein put it this way: “Concern for the man himself and his fate must always form the chief interest of all technical endeavors.” This pronouncement elegantly states what we consider the obvious, and what we consider the driver of our business: we simply say that every design or engineering project begins and ends with the user. The approach can be very useful when applied to internal process re-engineering, helping to ensure that rapid and effective adoption leads to the intended improvements in efficiency. Typical steps include functionality requirements collection and definition, user analysis, systems architecture, prototyping and pilot program launch.
7-Step Process Improvement
- Document what employees say they do
- Audit the process to see what they are actually doing
- Document the process, identifying re-work loops, redundancies, bottlenecks and manual processes that could be automated
- Lean the process
- Document the new process
- Train to the new process
- Go back to step 1.
The method is iterative until no further improvements can be identified, and should be repeated periodically as part of a regularly scheduled internal continuous improvement program.
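The iterative loop above can be sketched schematically. The `audit` and `lean` functions below are simplified stand-ins for the real documentation and audit work, applied to an invented invoicing process:

```python
def audit(process):
    """Steps 1-3: compare stated vs. observed work and flag steps
    that add no value (redundant steps, manual re-work)."""
    return [s for s in process if s.get("redundant") or s.get("rework")]

def lean(process, issues):
    """Step 4: remove the flagged steps from the process."""
    return [s for s in process if s not in issues]

def improvement_cycle(process, max_rounds=10):
    """Steps 5-7: re-document, train, and go back to step 1,
    repeating until an audit finds nothing left to remove."""
    for _ in range(max_rounds):
        issues = audit(process)
        if not issues:
            break  # no further improvements identified
        process = lean(process, issues)
    return process

invoice_process = [
    {"name": "enter order", "redundant": False},
    {"name": "re-key order into billing system", "rework": True},
    {"name": "approve invoice"},
]
print([s["name"] for s in improvement_cycle(invoice_process)])
# -> ['enter order', 'approve invoice']
```

The termination condition mirrors the protocol: the loop exits only when an audit can no longer distinguish what employees say they do from value-adding work actually observed.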
- Often the first task we undertake for a client is an Independent Program and Project Analysis (IPPA). Strictly speaking, an IPPA is a product, not a tool, and it’s the first step in a Business Process Improvement initiative. We call it a tool because our clients will typically use the report as a continuing reference as they develop plans to improve both efficiency and effectiveness within their organizations. In preparing an IPPA, we use quality assurance audit protocols to perform a business process audit, typically on top-level processes and selected critical procedures. The areas of focus are determined in collaboration with the client, based on known problems or critical needs. We evaluate the effectiveness of a process and the quality of its output by working through the first three steps of our seven-step process improvement protocol. Our findings and recommendations include potential ways to “Lean” the process, that is, to eliminate steps that don’t add value. Another technique we borrow from the quality assurance toolbox is root cause analysis, a process that examines a problem by asking not just what it is, but why it is occurring and what causal relationships are associated with it. The final IPPA describes the context and assumptions around a catalog of findings and recommendations. A typical IPPA will recommend changes that can save 5 to 20 percent of an organization’s operating budget while providing measurable improvements in the quality of goods or services.
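Root cause analysis in its simplest form is a chain of “why?” questions (often called the “five whys”). A minimal sketch, with an invented causal chain:

```python
# Each entry answers "why?" for the problem on its left.
# The chain below is invented purely for illustration.
causes = {
    "invoices go out late": "billing data arrives incomplete",
    "billing data arrives incomplete": "orders are re-keyed by hand",
    "orders are re-keyed by hand": "order and billing systems are not integrated",
}

def root_cause(problem, causes, max_whys=5):
    """Follow the 'why?' links until no deeper cause is recorded."""
    while problem in causes and max_whys > 0:
        problem = causes[problem]
        max_whys -= 1
    return problem

print(root_cause("invoices go out late", causes))
# -> order and billing systems are not integrated
```

Note how the answer is not the symptom the client first reports: fixing the root cause (system integration) eliminates several intermediate problems at once, which is where the budget savings in an IPPA typically come from.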