I have this dream to get my own testing consultancy going in a few years. Find good thinkers, find clients in need of smart testing, do the right thing... nothing fancy, just good ol' testing. There are plenty of challenges ahead of me on the way to this dream (it's not even a SMART goal yet), of course, but from time to time I like to think about little details as if I were already there.
Today I got to this: how would I organize the payment system in my dream company?
I've been in the industry for almost 11 years now, and everywhere I see a more or less similar picture:
- people are hired from outside rather than capable people being promoted from within;
- a newly hired specialist gets a higher salary than someone in a similar position who's already working for the company;
- companies don't work hard enough to keep their people as they consider them easily replaceable.
Now, I don't know about you, but to me it seems unfair and just plain dumb. IT is an industry that mostly gets its profit from innovation, and people are the basic material that makes that profit possible. People are important. A software company is not a factory! It can never work as a factory for a very simple reason: factories deal with repetitive tasks, whereas software development always deals with new tasks in a constantly changing environment.
An equally simple fact follows: if you want to be successful, you cannot treat people who work for you as replaceable parts of the system. It means both high expectations and high rewards.
So, going back to the initial problem... this is what I'm thinking of applying to my future business:
- a newly hired specialist will get a salary no higher than someone already working in the company who is doing the same job. Which automatically means that salaries for anyone inside the company are at least on par with the market;
- all salary levels will be transparent, as will the requirements for moving between those levels;
- the structure of profits and spending will be available for anyone inside the company to see and comment on;
- when we need to fill a position requiring a higher level of competence than we already have, we will first look inside the company for anyone who is ready to upskill and take it on as a challenge.
I strongly believe that if a company wants loyalty, it has to be loyal to its employees first. And it also makes perfect commercial sense to me: the more happy, motivated and highly qualified people I gather, the more efficient and higher-quality work we can do, the happier the clients will be, and the more clients will want us. Profit!
So, does anyone think my payment rules would work? Do you think they're fair? Would you like to work in a company with rules like these? Do not hesitate to tell me if you think it's an awful idea.
Thursday, 8 May 2014
Lessons learned in non-functional testing
Yesterday I gave a lightning talk at my favorite WeTest Auckland meetup, hosted by Assurity. It was a five-minute (okay, maybe six-minute) presentation with 15 minutes of discussion afterwards. So, with only five minutes, I had to make it really short yet still make sense.
The main idea I was trying to express is really simple: plan for non-functional quality characteristics right from the beginning.
When I first started working as a software tester, fresh IT graduate that I was, I assumed that developers would do proper technical and database design for the task at hand, and that the product would be tested from the very beginning. In other words, I thought it was obvious that someone would start testing as soon as the first requirements arrived from a customer. In reality it wasn't like this at all. Testing usually started at the latest stages of development, which cost projects lots of work hours, effort and money, and often reputational losses. As the years passed, the situation seemed to improve. Nowadays most IT companies accept that testing is important and that it should begin as soon as possible. With one "but" - it only applies to functional testing.
When I switched from functional to performance testing last September, I no longer had that naive assumption about functional testing being done the right way everywhere... it turns out I had an even more naive assumption: that non-functional testing is done the right way. I haven't done any serious programming in 7 years, yet when I occasionally think about implementing this or that, I still operate by the algorithm I learned at uni: I ask myself questions about the problem I'm trying to solve, and about the future of the software that would solve it. I thought it was obvious: when you write, say, a web application, you think about a bunch of stuff on top of functionality: how it would scale, how you would plug in new functionality, how you would localize it, how you would customize it, what questions users will ask about it, how you would test it, and so on. Sometimes most of the answers would be "it doesn't matter because..." - but the important thing is to ask the questions and know that for sure.
Well, in real life that rarely happens. I'm not talking about companies like Google or Amazon - surely those guys know what they are doing, if only because they have enough experience in the area. But most companies aren't Google. What I see in real life is fire fighting: an application is created, it goes into production, and then you have problems in production, which are extremely expensive to fix. Some issues are obvious: e.g. security breaches and performance and/or scalability breakdowns. Some issues are more subtle: e.g. bad supportability means a lot of time and effort spent by field agents (support line, or operators, or whatever you call the people who talk with unhappy users). Bad maintainability means additional problems when you need to install updates or reconfigure something in a live system. Bad testability means more time needed to test, e.g., a new release, and potentially worse quality because there will be areas testers can't reach. Bad initial usability will bite you in the ass when you decide to implement a shiny new GUI that is different from the existing one, because users will be resistant to change (remember the MS Office 2003 -> 2007 outcry). And what's even worse, fixing a non-functional issue often requires fundamental architecture changes.
To summarize: non-functional quality is important, it costs a lot to fix, and it isn't given enough attention.
So, the question is, if you work in one of those companies, what can you do?
As part of the team, regardless of your role, you can communicate and underline the importance of the following:
- When doing tech design for a new application or a new feature, think about the future. What will happen to your application in a year? What new functionality might it have? What will the environment and the load be (e.g. number of users per hour)? Are you gonna store any sensitive data? Will you need to create additional applications (e.g. mobile) to coordinate with it? What additional data might you need to store? How would you localize/customize it if necessary? How often will you need to release a new version?
- Do risk analysis. Always do risk analysis - better to be prepared for the trouble that might come your way.
- Involve specialists in specific fields if your team lacks them: there are consultancies that can review your tech design, or do security or performance testing for you.
- Start the actual testing as soon as possible - not only functional testing, but non-functional too. In the early stages it will look different from how it looks later on: e.g. it makes little to no sense to do full performance testing of a half-made release, but it is possible to performance test existing separate components, especially the core ones (see the small sketch after this list). It pays off because the earlier you fix an issue, the cheaper it is.
- Treat non-functional issues fairly. If the team doesn't fix e.g. testability issues in time, you will pay for it later with much more complicated testing and a higher price for building testability into the project. If performance or security issues are not treated with respect because "it still works, no one will notice it's slow/insecure", you will pay with your reputation and money when it manifests in production. And if your product is a success, it will manifest in production sooner or later. #too_much_painful_experience
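To make the "test components early" point a bit more concrete, here is a minimal sketch (not from the talk) of what component-level performance testing could look like, in Python. Everything in it is a placeholder assumption: call_component stands in for whatever core piece you already have working (a parser, a DB query wrapper, a single service call), and the concurrency and iteration numbers are made up. The point is only that a few dozen lines are enough to start collecting latency numbers long before the whole release exists.

```python
# A minimal, hypothetical sketch of early component-level performance testing.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor


def call_component():
    # Placeholder: replace with a call to the real component under test
    # (a parser, a repository method, a single service endpoint, ...).
    time.sleep(0.01)


def timed_call(_):
    # Time one call to the component.
    start = time.perf_counter()
    call_component()
    return time.perf_counter() - start


def run_load(concurrency=20, iterations=500):
    # Hit the component from several threads and collect per-call latencies.
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(timed_call, range(iterations)))
    return {
        "median_ms": round(statistics.median(latencies) * 1000, 2),
        "p95_ms": round(latencies[int(len(latencies) * 0.95) - 1] * 1000, 2),
        "max_ms": round(latencies[-1] * 1000, 2),
    }


if __name__ == "__main__":
    print(run_load())
```

If you run something like this against the same core component on every release (or even every build) and keep the numbers, regressions tend to show up long before a full end-to-end performance test would catch them.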
After I did the talk, one of the testers at the meetup asked whether all of this makes sense for startups and the like - she implied that in the beginning you don't want to spend your time thinking about all this stuff, because you are under pressure to deliver fast. That's another point of view, but I strongly disagree with it. The only case in which you can forget about quality is when you are making a proof of concept, then throwing it out the window and starting anew. If you are gonna use the codebase of your first release further, it pays off greatly to spend a few extra hours (and in the very beginning you would only need a few extra hours) on analysing the future of your application and doing your tech design (database design and application design) properly.
Remember: there is nothing more permanent than a temporary solution.
Slides for the presentation are shared here:
https://docs.google.com/file/d/0Bxi4eMT3I97ea3luRU9nemRpODg/edit