This article was developed by the editors of Cadalyst, a magazine and web site devoted to providing software and hardware information, advice, and tips for CAD managers and users. It is published here with permission of the publisher.
If your company is considering the deployment of new CAD tools, doesn’t it make sense to test the software thoroughly before pushing it to the entire company? Of course! But how, exactly, should you carry out such testing?
In this whitepaper, I’ll lay out my best practices for software trials — a step-by-step list of how to ensure a thorough, controlled, successful test process. Think of the trial deployment as a dress rehearsal for the eventual real-world implementation you’ll perform, and you’ll be amazed how much you’ll learn. Let’s get started.
Step 1: Know Your Team
As a CAD manager, you need to know your team: what they can and cannot do with CAD, when they need vacation time, when they need support, and when they need training. The list goes on, and managing it all can feel like the proverbial herding of cats. You have to juggle and stay flexible, but above all, know your people. Learn their individual strengths and help them shore up their weaknesses. Assign work to the appropriate team member: long-term work should go to the slower drafters, while tight-deadline work goes to the faster drafters.
Don't be afraid of wearing lots of different hats; it will happen, and it will enhance your all-round knowledge and make you a better manager, not just of CAD but of your team. Manage to the best of your ability, stay knowledgeable, and most of all, be approachable. Don't sit in an ivory tower surveying your empire; be part of it and enjoy the experience!
Step 2: Establish a Proving Ground
Aircraft design companies use proving grounds to test designs in real-world conditions before ever delivering an aircraft to a customer. The proving ground concept supports rapid but extensive testing in a controlled environment, where engineers can catch errors early and update the design easily as they receive feedback from test pilots.
In a CAD context, the proving ground is simply a group of test users — I call them test pilots — who put new software through a controlled barrage of testing to find out which key features to pursue, what works and what doesn’t. I view the proving ground as more than a place to purposely crash and debug the software; rather, it’s a means to verify installation deployments, change standard work processes, update standards, devise training strategies, and maximize user comprehension. In short, it’s the place to battle-test new software and get it ready to implement. Ultimately, if the trial software doesn’t meet your company’s needs, the proving ground is the ideal place to figure that out instead of presiding over a failed trial that involved many more users.
Step 3: Identify Your Test Pilots
The proving ground will only be as good as the test pilots you recruit. Successful test pilots possess the following traits:
strong desire to learn new software
calm under pressure
ability to communicate problems clearly
perseverance and good follow-through
Software test pilots are a special breed of user who realize they’ll be trying out new tools that will present challenges, possibly even crash, yet are still excited to be a part of the process to prepare the software for production.
Give me a few test pilots with these attributes and I promise I'll be able to evaluate new CAD software and make it work. Without these test pilots, I'll have to release new software to my general user population, which will likely panic when confronted with anything that doesn't work perfectly the first time. Does that latter scenario sound familiar to you?
Admit it: You already know who your test pilots are, don’t you?
Note: Many CAD managers, especially in smaller firms, might be tempted to serve as the solo test pilot for a software trial. I recommend against this practice unless it’s your only reasonable option. Having the perspective of at least one other user is invaluable in uncovering problems with software before you deploy it for production.
Step 4: Create an Isolated Production Environment
Your test pilots should receive the trial software and any customized setup on their machines, configured so that they can always revert to the standard CAD tools if project demands dictate. (Think of this as the ejector seat that lets them escape the new CAD tool if a project crash is imminent.)
Further, isolate the proving ground so that only a select number of projects use the experimental software. The goal is to prove new software on a working project while keeping the risk profile low in the event of data corruption, version conflicts, or other unforeseen difficulties.
Finally, set up the proving ground to deliver new software exactly as it would work in a production environment, so you can debug the installation files and network delivery alongside the software itself.
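As a sketch of how the "ejector seat" idea might be automated, here is a hypothetical script that switches a workstation between standard and trial CAD configuration folders. The folder layout, profile names, and pointer-file mechanism are my own assumptions for illustration, not something prescribed by any particular CAD package:

```python
from pathlib import Path

def switch_profile(config_root: Path, profile: str) -> Path:
    """Point the 'active' CAD configuration at either the standard
    or the trial profile, so a test pilot can revert instantly.

    Hypothetical layout (an assumption for this sketch):
        config_root/standard/  - production CAD settings
        config_root/trial/     - experimental settings under test
        config_root/active     - text pointer read by launch scripts
    """
    if profile not in ("standard", "trial"):
        raise ValueError(f"unknown profile: {profile!r}")
    source = config_root / profile
    if not source.is_dir():
        raise FileNotFoundError(f"profile folder missing: {source}")
    # Record the selection; a launch script would read this file
    # to decide which settings folder to load.
    (config_root / "active").write_text(profile + "\n")
    return source

def active_profile(config_root: Path) -> str:
    """Return the currently selected profile name."""
    return (config_root / "active").read_text().strip()
```

Because the standard settings folder is never touched, reverting is a one-line call (`switch_profile(root, "standard")`) rather than a reinstall, which keeps the risk to live projects low.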
Step 5: Interview Your Test Pilots
Check in with your test pilots as they evaluate the trial software, and be ready to learn from their experiences. Always ask the following:
What problems did you have?
Which symptoms did you notice as errors occurred?
What was confusing and what worked well?
What would make the software easier to use?
How would you explain your experiences to other users?
Called debriefing in proving ground environments, this process is the fastest way to generate accurate user feedback. The information you glean will not only help you debug the software but also create training materials for eventual production users. Take detailed notes of your debriefing sessions to document what you learned for reference later.
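The five debriefing questions lend themselves to a simple structured record, which makes the notes easier to search when you build training materials later. As an illustration only (the field names and JSON-lines format are my own assumptions, not from the article), a script like this could keep each session in a consistent form:

```python
import json
from datetime import date
from pathlib import Path

# The five debriefing questions from the article, keyed by
# shorthand field names (the keys are my own invention).
DEBRIEF_QUESTIONS = {
    "problems": "What problems did you have?",
    "symptoms": "Which symptoms did you notice as errors occurred?",
    "confusing_vs_working": "What was confusing and what worked well?",
    "ease_of_use": "What would make the software easier to use?",
    "explanation": "How would you explain your experiences to other users?",
}

def record_debrief(log_file: Path, pilot: str, answers: dict) -> dict:
    """Append one test pilot's debrief as a JSON line for later reference."""
    missing = set(DEBRIEF_QUESTIONS) - set(answers)
    if missing:
        raise ValueError(f"unanswered questions: {sorted(missing)}")
    entry = {"date": date.today().isoformat(), "pilot": pilot, **answers}
    with log_file.open("a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

Forcing an answer to every question keeps debriefs complete, and the append-only log preserves each iteration of testing for comparison.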
Step 6: Improve, Iterate, and Repeat
Now that your test pilots have shared their feedback, it is time for you to adjust your software accordingly, document those changes for training purposes later, tweak the proving ground environment as needed — and repeat the process! Each iteration of the process should run more smoothly than the last as you continually improve your trial deployment to arrive at a point where your test pilots agree the software is ready to go.
You might be tempted to rush the deployment process by bypassing additional testing; however, remember that every iteration will make the software that much easier for users to learn and use. More test pilot missions will pay off later in reduced training time.
Step 7: Make Your Test Pilots Heroes
As your test deployment progresses, buzz around the office will inevitably grow louder. This is the time to praise your test pilots publicly. When you call attention to the test pilots as being hard working, sharp, talented users who are helping the company get ahead with new technology, you’ll establish a culture that encourages users to pursue the test pilot honor. Wouldn’t it be awesome if all your CAD users strived to be as good as your current test pilots? How much easier would it be for you to train and implement new CAD tools if everybody had a test pilot mindset?
As You Go: Manage Rogue Software Trials
Do you have users who like to undertake their own software trials? Some users might find and download tools at random, while others might hear the buzz about a tool you’re evaluating and undertake their own testing outside your tightly controlled proving ground. Although it’s great to have users who are interested in finding new and better tools to get the job done, these “rogue test pilots” can sometimes cause more problems than you’d imagine, including:
Increased burden on the CAD manager. The many questions and problems that inevitably arise from unauthorized testing will distract you and take time away from your controlled tests and other priorities.
Diversion from standards. If these rogue pilots propose work methods that don't jibe with existing company standards, you might end up fixing the problems that result.
Increased errors. When operating in an uncontrolled environment, your rogue pilots may inadvertently write over project files, create versioning problems, or otherwise corrupt company data. Although such errors aren’t malicious, they are costly nonetheless.
Increased overhead. It’s one thing to spend overhead time conducting a controlled test of new software; it’s even harder on your budget and productivity when rogue users jump into the fray. To minimize testing costs, you must tightly control the number of test pilots and time spent.
Without discouraging users from learning something new, you should establish some hard-and-fast rules to prevent rogue software testing.
If a user identifies a tool that he or she would like to evaluate, the first course of action should be to bring it to the CAD manager to be considered for controlled testing.
If you can’t support a controlled test for any reason, you can encourage the user to test the tool independently, with the following conditions:
Testing must be done on a user’s own time.
It must not require IT or CAD management support.
It must be isolated from company networks.
It must adhere to company standards if the tool is to be considered for production use.
By making these policies clear, you can avoid problems associated with unauthorized test pilots while still allowing self-motivated users to explore new tools.
Step 8: Build Your Training Plan
As test pilots run new CAD tools through the proving ground, you should glean a wealth of information that will help you during mass implementation later. If you pay attention and take good notes throughout testing, you should learn
which concepts to stress,
how to best explain difficult concepts,
which user problems to anticipate, and
which work methods and standards worked best.
With all this in mind, you can start to draw conclusions about how you’ll train production users, how you’ll administer the software, and which standards and best practices you’ll need to make the new software run best.
Following the trial deployment best practices outlined in this whitepaper, you should wind up with software that is stable, easier to use, and easier to teach. Although the test pilot and proving ground concepts may have originated in the aircraft industry, I think you'll find the methods work just as well for evaluating all manner of CAD tools. And, it turns out, the proving ground isn't just a place to get software running; it is also a usability lab where you can optimize the software for all future users.
Robert Green performs CAD programming and consulting throughout the United States and Canada. He is a contributing editor for Cadalyst magazine and the author of Expert CAD Management: The Complete Guide. Reach him via his web site, www.cad-manager.com.