Contrary to what many programmers think, QA's role is not to do their dirty work for them. QA's role is to validate, independently, that the code actually works.
The reason I put the responsibility on the coder is simple. The coder is the one who writes the code, the one who understands it, and the one who can change it. Why should anyone else be the owner?
QA has far fewer options than the developer for proving the code works and reducing risk; they can only test the functionality from a black-box perspective.
Smart Software Developer using Virtual Lab Automation
The developer, on the other hand, has multiple options, beyond the ones already listed in Part II:
Rewrite the code in a more modular fashion so it is easier to write unit tests
Move from C# to Python to make it easier to write mocks and do subsystem testing
Add logs, alerts, and assertions so he knows that edge conditions are safely handled
Refactor the code so user-interface validations and server validations use the same mechanism
Add new code behind a separate flag\object\screen so it has less chance of causing regressions in other functionality
Shout at the product manager that the requirements are too complex and there is no way to implement them in SQL with proper testing
Move from the simple ASP.NET model to the MVC model so more parts of the UI can be tested separately
Ask QA to help with extensive PRE-COMMIT manual testing as part of the development stage
Ask QA to help with running the automatic testing on development branches
Help the automated QA team make sure new features are tested during the development stage and not post-deployment
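The first three options above can be sketched in Python (the language the post suggests moving to for easier mocking). Everything here is a hypothetical illustration, not code from any system the post describes: the dependency is injected so a mock can replace it in a unit test, and an explicit guard handles an edge condition.

```python
# Sketch of options 1-3 above. OrderService and tax_gateway are invented
# names for illustration only.
from unittest import mock


class OrderService:
    """Order-total computation with the tax lookup injected, not hard-wired.

    Splitting the dependency out (more modular code) is what makes the next
    option possible: the collaborator can be replaced by a mock in tests.
    """

    def __init__(self, tax_gateway):
        self.tax_gateway = tax_gateway  # any object exposing get_rate(region)

    def total(self, amount, region):
        rate = self.tax_gateway.get_rate(region)
        if rate < 0:  # guard an edge condition explicitly, instead of hoping
            raise ValueError(f"negative tax rate for {region!r}")
        return round(amount * (1 + rate), 2)


def test_total_uses_injected_rate():
    gateway = mock.Mock()
    gateway.get_rate.return_value = 0.10  # no live tax service required
    assert OrderService(gateway).total(100.0, "EU") == 110.0
    gateway.get_rate.assert_called_once_with("EU")


test_total_uses_injected_rate()
```

The point is not the toy domain: once the collaborator arrives through the constructor, proving the code works no longer requires QA's black-box access to a full environment.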
The manager's role is to:
Iterate, over and over, on the concepts of ownership, proof, and responsibility
Back the theory with resources – buy machines for testing, software for code checking, etc.
For example, buying two servers for the clustering team so they can test their code actually runs on a cluster
Help to manage trade-offs and real world considerations
For example, which functionality is used a lot and which is hardly used
Pay the “price” for making higher quality code
For example, Pay $50,000 for a new automated testing project
Avoid being dogmatic about the specific methodology
For example, unit testing might not be effective in certain places, and forcing everyone to write unit tests will just create resentment
Introduce and promote new technologies such as Virtualization and lab automation
Help apply the right methods in the right context
To summarize, like any other professional, the developer is the one responsible for the quality of his or her work. Allowing developers to push unproven code to customers is what gave us a bad reputation as an industry. However, the best ones are able not just to code, but also to analyze risk, check for validity, and rewrite and redesign to create bulletproof products.
And if you have read this far, here is a reminder of a lovely '80s song.
For any developer that ever spent three months looking for a memory leak, Purify was a godsend.
Its main advantage – the technology actually worked.
It is amazing to see that even in the .NET environment we still keep having memory-leak problems, but with no Purify to the rescue. Even more amazing is that the inventor of Purify is also the founder of Netflix.
The mother of everything virtualized. It hinted at the future. It was the first "emulation" software that actually worked.
Freezing code in time, replaying bugs, running SoftICE inside a VM, running Linux on Windows, multiple servers on a single machine. All of that for $300 ($240 if you bought many of them).
In Part One I examined some myths about hardware and software appliances. Today I'll try to describe why hardware appliances became so successful in recent years, and where.
The basic ideas come from a great NetApp pitch I heard in 1994, when they were very small. Their example at the time was "Routing was done by generic Sun\IBM\HP\Digital computers, and Cisco turned it into an appliance". The analogy was "File serving is done by generic Sun servers, and NetApp is going to be the filer appliance", which they did.
Appliances can be great because:
Appliances can be cheaper than a PC – creating a $60 small-office router is just not possible using PC hardware components. Even a $1000 enterprise branch-office appliance is better off using a cheap CPU and little memory to achieve a great margin.
Appliances are much easier to install – this is probably still true. Having someone else tie together all the software, do the hardening, and remove the extra bits, with no drivers to deal with, is a great win. Installing the right RAID driver for a generic Linux system can still be quite challenging.
Appliances can have better performance for dedicated tasks – NetApp's favorite example was trying to list 2,000 files in a big directory. It could take several minutes on a generic Unix file system. Since NetApp designed the operating system just for file serving, it was done amazingly fast.
Appliances can have a much better form factor – it is quite hard to put 12 network cards in a single PC. To populate it with 40 is just impossible. Moreover, the network cards on x86 servers are on the wrong side! Network equipment makers place the cards in the front, while generic servers have them in the back. Again, it seems like a small thing, but try to get Dell, HP, or IBM to change that for your appliance.
The right side of the cable
Appliances are not managed by the server group – one of the biggest selling points for network departments is that the server group cannot touch dedicated operating systems. If a firewall admin buys a Linux server, she has to conform to the Linux guidance and dictatorship of the server admins. If it is PimiPimiOS, they have no say about it.
Appliances are more secure – this is true to some extent, simply because the functionality is limited and no extra services are installed. However, in many cases it boils down to security by obscurity. Nobody bothers to update their appliances with the latest security patches, and the proprietary operating systems are not inspected by the community. Furthermore, security applications cannot be run on these unique environments.
Appliances boot faster – it seems like a small thing, but waiting ten minutes for Windows to load is not really acceptable for an enterprise-grade router or file server. It is also quite annoying in your home DSL modem. Actually, it is quite annoying on my $2000 ThinkPad. Anyway, having a very small, optimized OS and no hard disk allows a very fast boot time, along with dedicated thinking about boot and reboot length.
Appliances are more reliable because they have no hard disk ("moving parts") – maybe; I'm not so sure about this one. Anyway, in a few years no server will have any moving parts (although it seems fans are moving all the time…).
Appliances have a superior, dedicated management console – this is commonly true. Good appliances have a great unified web and command-line management console that bundles all management aspects, from image management to application configuration. The problem starts once you have 30 different appliances from different vendors, each with its own dedicated ChuChuOS. On a side note, it tends to be quite hard to script and program these beasts, for the same reason.
To make the discussion more interactive until I post the third piece, here is a small poll to get your feedback.
“An issue has been uncovered with ESX/ESXi 3.5 Update 2 that causes the product license to expire on August 12. VMware will reissue the binaries in the next 36 hours (by August 13, PST). Until the issue is resolved, we advise against upgrading to ESX/ESXi 3.5 Update 2.”