David Chapman - Chapman Consulting
I am an expert witness and experienced contract software developer in San Jose, California, with over 1,300,000 lines of code written in 34 years of industry experience. My expert witness specialty is GDS/GDSII/OASIS data file review for evidence of patent infringement or IP misuse. My software development specialties are compute-intensive algorithms, Electronic Design Automation (EDA), and parsers/compilers.
I provide algorithm design, development, testing/debugging, and optimization services for high performance, high value, and high reliability software. If you don't know how to get your software project started, need help writing or testing it, or need to improve it, I can help.
After 1,300,000 lines of code, I've learned a few surprising things about writing quality software. I have written a series of essays about software development, testing, and optimization. A few are featured below. Read some of the essays, then call or write to find out how I can help you deliver code that works - and works well.
Software Development Done Right.
A Line of Prevention Is Worth Ten Pages of Cure: Coding Strategies to Reduce Errors in New Code
Everyone expects new code to have a higher bug rate. Software goes through "alpha" and "beta" releases before being sold to the general public; even early production releases have significantly higher patch rates than older code. Customers accept this in return for being the first to see a new system. But that doesn't mean it's a good thing!
Typically the release date of a first production version is based on reaching a targeted bug rate. The number of bugs per customer drops over time until it reaches an acceptable threshold. Normally the decrease is smooth, and rapid at first. This is a good thing, of course, but it is the result of a simple truth: most bugs in new code are trivial.
Read more: A Line of Prevention Is Worth Ten Pages of Cure: Coding Strategies to Reduce Errors in New Code
Optimizing with ^C
Quickly Getting Up to Speed
When you are asked to speed up a program, the temptation is to reach for a runtime profiler such as gprof (the GNU tool) or IBM's Rational Quantify. Typically profilers instrument a program, then run it under the profiler's control. They may measure the time taken by each instruction sequence, or they may interrupt the program many times per second and count how often the program is in each portion of the code. They may even do both.
Runtime profilers can be very useful, but your program will run much slower under test than it normally would. For software that already takes minutes or hours, this can be quite a burden. There is additional overhead for each function call and return, so well-structured code with many small functions may slow down still more.
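The interrupt-and-count style of profiling described above can be sketched in a few lines. This is only an illustrative toy, not any particular profiler's implementation: a background thread in Python periodically inspects the main thread's current stack frame (via CPython's `sys._current_frames()`) and tallies which function it lands in. The function names `sample_stacks` and `hot_loop` are made up for the example.

```python
import collections
import sys
import threading
import time

def sample_stacks(target_tid, counts, stop, interval=0.001):
    # The sampling approach: wake up periodically, look at the
    # target thread's current frame, and count the function name.
    # Hot functions accumulate the most samples.
    while not stop.is_set():
        frame = sys._current_frames().get(target_tid)
        if frame is not None:
            counts[frame.f_code.co_name] += 1
        time.sleep(interval)

def hot_loop(n):
    # A deliberately compute-heavy function for the sampler to catch.
    total = 0
    for i in range(n):
        total += i * i
    return total

counts = collections.Counter()
stop = threading.Event()
sampler = threading.Thread(
    target=sample_stacks,
    args=(threading.main_thread().ident, counts, stop),
    daemon=True)
sampler.start()
hot_loop(5_000_000)
stop.set()
sampler.join()
print(counts.most_common(3))  # hot_loop should dominate the samples
```

The key trade-off is visible even in this sketch: the program under test runs nearly full speed between samples, but the picture you get is statistical, so short-lived functions may be missed entirely.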