Software Testing

Recording and Slides From Today's Webinar on Decision Tables

Randy Rice's Software Testing & Quality - Thu, 08/11/2016 - 13:03
Thanks to everyone who attended today's webinar on decision tables. For those who could not get in
due to capacity limits, I apologize.

However, here are the slides:
http://www.riceconsulting.com/public_pdf/Webinar_Decision_Tables.pdf

And here is the recording:
https://youtu.be/z5RlCBKxfF4

I am happy to answer any questions by e-mail, phone or Skype. If you want to arrange a session, my contact info is on the final slide.

Thanks again,

Randy
Categories: Software Testing

Will Code for Food

QA Hates You - Wed, 08/10/2016 - 07:15

Literally. I saw this in the back of Ozark Farm and Neighbor magazine:

At the very cheapest, a domain registration + a year of simple hosting with domain purchase + use of templates and standard copy means that any beef above a couple of steaks is pure profit.

Categories: Software Testing

As A Wise Man Once Said….

QA Hates You - Wed, 07/27/2016 - 08:28
Categories: Software Testing

QA Music: They’ve Come To Snuff The Testing

QA Hates You - Mon, 07/25/2016 - 04:11

You know it ain’t gonna die.

Alice in Chains, “Rooster”:

Categories: Software Testing

The A Word returns (sort of)

Alan Page - Thu, 07/14/2016 - 21:29

Chris McMahon, who has always impressed me with his words and his wit, called me out in his blog.

Specifically:

Apropos of my criticism of “Context Driven Approach to Automation in Testing” (I reviewed version 1.04), I ask you to join me in condemning publicly both the tone and the substance of that paper.

Almost exactly a year ago, I reviewed a draft of the paper, and my name is among those listed as reviewing the paper. The feedback I gave was largely editorial (typos and flow), with a few comments about approach that I’ll repeat here:

The first red flag I called out (and will call out again here) is the phrase that describes test automation as something to “automate testing by automating the user”. This is a shallow view of test automation, but beyond that comment, I didn’t push hard on it. In hindsight, this was a mistake (much of The A Word touches on this topic).

In regard to the scenario the authors chose for automation, I thought the choice was weird, and asked for more…context and provided some food for thought:

I think you can add even more emphasis to how and why you chose to automate scene creation. Too many times, testers choose to automate because they can automate. The thought process (for me) may be, “The scene feature is pretty important, I’m curious what happens if an author has thousands of scenes. Will it cause performance problems, formatting problems, file load problems, or other issues, etc. There’s no way in hell I want to do this manually, so let me write some code to help me run my experiment”.  You may also want to discuss alternate implementation ideas – e.g. creating a macro in Notepad++ to create the text then paste it in, or creating a macro in Word for the same, or using for in a windows console (e.g. for /L %f in (1,1,1000) do echo “##”>>output.txt&echo “Scene %f”>>output.txt ). Using tools for creating and manipulating data could be a whole article.
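For what it’s worth, the quoted console one-liner translates to a few lines of Python. A minimal sketch, where the file name and the exact stub format are illustrative assumptions, not details from the paper:

```python
# Cross-platform sketch of the quoted console one-liner: generate a large
# number of scene stubs to stress-test scene handling.
with open("output.txt", "w") as f:
    for n in range(1, 1001):
        f.write("##\n")          # scene delimiter, as in the one-liner
        f.write(f"Scene {n}\n")  # numbered scene heading
```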

And that led to my real beef with the paper. It talks about using tools to test – which can be a good thing, but it doesn’t really talk about automation in the way successful teams actually use it.

I think it may be important to talk about purposes of automation and where to apply it – or at least one context of that – as I don’t think that’s discussed enough (but I’ve made a note to myself to write a blog on this very subject). At a BFC (big company) like Microsoft, we write a lot of shared / distributed automation – automated tests that we need to run on a lot of different hardware/platform configurations in some sort of lab setting (tools like SauceLabs or BrowserStack are helpful here). Web apps are famous for this [problem].

I also commented:

The other kind of automation (which you cover in your article) is what I sometimes call exploratory automation (or more often, just “Testing”). This is where we get a test idea, and want to write some quick automation to help learn about the product. While we may turn this sort of test into something that’s distributed / shared someday, its primary purpose is to help me answer questions and learn. There’s (another) story in HWTSAM where I described a case of this. I wrote really ugly brute force automation in C (using things like FindWindow and SendMessage(LBUTTON_DOWN…) to simulate opening and closing a connection to a remote host many times (the only thing this app did). It found a nice memory leak that may not have been found otherwise (or at least not as quickly).

All of this feedback fed my uber-point, which was that while the article talked about test automation, the examples really just talked about using somewhat random tools to help the authors explore or test some software. There was nothing about strategy, or about more typical use of test automation. I asked about it in a comment:

I wonder why you don’t use the word “Tools” in the title – e.g. “A CDT approach to tools and automation in testing” or something like that.

…because the paper is about, as I said above, using non-standard tools to help test. Sure, it’s automation in a sense, but nothing in the paper reflects the way test automation is used successfully in thousands of successful products.

All that said, I do not support this paper as a description of good test automation, and I think it’s a poor way for anyone to learn how to write automation. Chris requested that the authors remove the paper; while I support this, and do believe the paper can cause more harm than good, there’s so much bad advice on the internet about creating software that removing this one piece of bad advice will hardly make a dent.

I did not realize my name was listed as a reviewer, and although I did (as admitted above) review this paper, I do not want my name associated with it, and will request that the authors remove my name.

(potentially) related posts:
  1. Last Word on the A Word
  2. Coding, Testing, and the “A” Word
  3. Automation & Test Cases
Categories: Software Testing

Keeping Your Test Data Out Of Production. Also, Your Production Data.

QA Hates You - Thu, 07/14/2016 - 04:39

There’s a right way and a wrong way to keep test data out of production. Citigroup chose the wrong way:

It turned out that the error was a result of how the company introduced new alphanumeric branch codes.

When the system was introduced in the mid-1990s, the program code filtered out any transactions that were given three-digit branch codes from 089 to 100 and used those prefixes for testing purposes.

But in 1998, the company started using alphanumeric branch codes as it expanded its business. Among them were the codes 10B, 10C and so on, which the system treated as being within the excluded range, and so their transactions were removed from any reports sent to the SEC.

The SEC routinely sends requests to financial institutions asking them to send all details on transactions between specific dates as a way of checking that nothing untoward is going on. The coding error had resulted in Citigroup failing to send information on 26,810 transactions in over 2,300 such requests.

Citigroup was fined $7,000,000 for the problem, which probably stemmed from a lack of communication.
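The article doesn’t show Citigroup’s actual code, but one plausible mechanism, offered here as an assumption rather than a known fact, is a lexicographic range check under mainframe (EBCDIC) collation, where letters sort before digits, so “10B” genuinely lands between “089” and “100”:

```python
# Hedged reconstruction, not Citigroup's real code: under EBCDIC collation
# (Python's built-in cp500 codec), letters sort BEFORE digits, so a naive
# lexicographic range check classifies "10B" as a reserved test branch.

def ebcdic(code: str) -> bytes:
    return code.encode("cp500")  # EBCDIC byte values preserve EBCDIC sort order

def is_test_branch_naive(code: str) -> bool:
    # Buggy pattern: range-compare codes that may not be purely numeric.
    return ebcdic("089") <= ebcdic(code) <= ebcdic("100")

def is_test_branch_fixed(code: str) -> bool:
    # Defensive fix: only purely numeric codes are eligible for the range.
    return code.isdigit() and 89 <= int(code) <= 100
```

In EBCDIC, ‘B’ is 0xC2 while ‘0’ is 0xF0, so “10B” compares less than “100”; guarding the range check with a purely-numeric test would have excluded only the intended codes.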

Categories: Software Testing

When You Hide The Interface for Functionality

QA Hates You - Thu, 07/07/2016 - 07:38

You know when your company wires off some part of the interface because the functionality is incomplete or not ready for the release?

Yeah, it’s like that.

It, too, is a risky maneuver, as it’s generally a last-minute decision, which doesn’t leave you a lot of time to test to ensure it’s wired off completely in all areas where the user would encounter it.

Categories: Software Testing

Lessons Learned in Test Automation Through Sudoku

Randy Rice's Software Testing & Quality - Wed, 07/06/2016 - 12:39
For many years, I have recommended Sudoku as a mind training game for testers. I think Sudoku requires some of the same thinking skills that testers need, such as the ability to eliminate invalid possibilities, deduce correct answers and so forth.

Much like an application that may lack documentation, Sudoku only gives you a partial solution and you have to fill in the rest. Unlike in software testing, however, guessing actually prevents you from solving the puzzle - mainly because you don't see the impact of an incorrect guess until it is too late to change it.

My friend Frank Rowland developed an Excel spreadsheet some time ago that he used to help solve Sudoku puzzles, adding macros to identify cells that had only one possible correct value. At the time, I thought that was pretty cool. Of course, you still had to enter one value at a time manually. But I thought it was a good example of using a degree of automation to solve a problem.

Fast forward to last week. I was having lunch with Frank and he whips out his notebook PC and shows me the newest version of the spreadsheet. After listening to a math lecture from The Great Courses, he learned some new approaches for solving Sudoku.

Armed with this new information, Frank was successful in practically automating the solution of a Sudoku puzzle. I say "practically" because at times, some human intervention is required.

Now, I think the spreadsheet is very cool and I think that the approach used to solve the puzzle can also be applied to test automation. The twist is that the automation is not pre-determined as far as the numeric values are concerned. The numbers are derived totally dynamically.

Contrast this with traditional test automation. In the traditional approach to test automation (even keyword-driven), you would be able to place numbers in the cells, but only in a repeatable way - not a dynamic way.

In Frank's approach, the actions are determined based on the previous actions and outcomes. For example, when a block of nine cells is filled, that drives the possible values in related cells. The macros in this case know how to deduce the other possibilities and can also eliminate the invalid possibilities. In this case of "Man vs. Machine", the machine wins big time.
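Frank's macros aren't shown here, but the core deduction - filling any cell whose row, column, and box already rule out eight of the nine values - can be sketched in Python. This is a minimal illustration of the idea, not Frank's spreadsheet logic:

```python
def candidates(grid, r, c):
    """Values not yet used in row r, column c, or the 3x3 box containing (r, c)."""
    used = set(grid[r]) | {grid[i][c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    used |= {grid[i][j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
    return set(range(1, 10)) - used

def fill_naked_singles(grid):
    """Repeatedly fill empty cells (0) that have exactly one candidate."""
    progress = True
    while progress:
        progress = False
        for r in range(9):
            for c in range(9):
                if grid[r][c] == 0:
                    cands = candidates(grid, r, c)
                    if len(cands) == 1:
                        grid[r][c] = cands.pop()
                        progress = True
    return grid
```

Each value filled in constrains its neighbors, which is exactly the dynamic, outcome-driven behavior described above: the next action depends on the result of the last one.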

I don't have all the answers and I know that work has been done by others to create this type of dynamic test automation. I just want to present the example and would love to hear your experiences in similar efforts.

I think the traditional view of test automation is limited and fragile. It has helped in establishing repeatable tests for some applications, but the problem is that most applications are too dynamic. This causes the automation to fail. At that point, people often get frustrated and give up.

I've been working with test automation since 1989, so I have seen a lot of ideas come down the pike. I really like the possibilities of test automation with AI.

I hope to get a video posted soon to show more about how this works. Once again, I would also love to hear your feedback.  


Categories: Software Testing

ASTQB Mobile Tester Certification - Live Virtual Classes in August and September, 2016

Randy Rice's Software Testing & Quality - Fri, 07/01/2016 - 09:58
According to a very recent survey published by Techwell, only 25% of respondents felt they had the knowledge/skills and tools needed for testing mobile applications!

That needs to change and I have just the way to do it.
I am launching a new schedule of live virtual training courses for this new certification from the ASTQB, starting August 9 – 11. 
I held off offering this course as a live virtual course until I was 100% positive you would feel fully engaged while taking it. I have developed a technique that I feel will keep you engaged in the material and prepare you for the exam. (By the way, the exam is not included in the cost of this course, but you can add the $150 exam to your registration.)

Also, let me encourage you to keep building new skills in testing. Mobile testing is a great specialty area and it is also a great skill set to have in your career. The time to learn mobile testing is NOW. Don't wait until your employer is looking to build that special team...or (and I hope this never happens in a bad way, but...) until you are looking for employment.
I will be the instructor for all presentations of this course. I am a co-author of the ASTQB Mobile Tester syllabus and author of this course. This course is fully accredited by the ASTQB. And...I have been teaching mobile testing since 2001!  You get personal access to me throughout the course and after the course to ask any questions.
This course covers all aspects of mobile testing. More about the live virtual classes can be found here: http://www.riceconsulting.com/home/index.php/Mobile-Testing/astqb-certified-mobile-tester-live-virtual-course.html
and the course brochure is here: http://www.riceconsulting.com/home/index.php/Mobile-Testing/testing-mobile-applications-astqb-certification-course.html
In the most recent ASTQB newsletter that was published on Tuesday, I have an offer of 15% off any ASTQB Mobile Testing course – live or e-learning. Just use promo code “MOBILE15” when paying for your registration. This offer is only good through August 15. For more details on this course or to register, please visit http://www.riceconsulting.com/home/index.php/Mobile-Testing/astqb-certified-mobile-tester-live-virtual-course.html
If you want to learn more about the ASTQB Mobile Tester certification, just go to: https://www.astqb.org/get-certified/steps-to-certification/
I hope to see you there!
Randy
Categories: Software Testing

Fantaztic

QA Hates You - Wed, 06/29/2016 - 08:50

After ordering a video game from Amazon to-day, I received an email with an offer:

Spelling the product name right two out of three times ain’t bad. It’s worse.

Remember, every time you spell something wrong in a marketing email, you’re making it easier for the phishers.

Categories: Software Testing

And Sometimes Ends With A

QA Hates You - Wed, 06/22/2016 - 05:57

Security Starts at the POS.

In this case, POS means Point of Sale.

However, not everyone is familiar with the acronyms and argot you are, so be careful when using them without explaining them first. This applies to your interfaces as well as your written work.

Categories: Software Testing

Another Branding Failure

QA Hates You - Tue, 06/21/2016 - 04:53

A couple weeks ago, I pointed out some flaws with inconsistent application of the trademark symbol. Today, we’re going to look at a failure of branding in a news story.

Can you spot the branding failure in this story?

After the refi boom, can Quicken keep rocketing higher?:

Quicken Loans Inc, once an obscure online mortgage player, seized on the refinancing boom to become the nation’s third largest mortgage lender, behind only Wells Fargo & Co and JPMorgan Chase & Co.

Now, with the refi market saturated, Quicken faces a pivotal challenge — convincing home buyers to trust that emotional transaction to a website instead of the banker next door.

Okay, can anyone not named Hilary spot the problem?

Quicken Loans and Quicken are two different things and have been owned by two different companies since 2002. For fourteen years.

Me, I know the difference because earlier this year I did some testing on a Quicken Loans promotion, and the developers put simply Quicken into some of the legalesque opt-in and Terms of Service check boxes. So I researched it. And then made them use Quicken Loans in the labels instead.

After reading the story, I reached out to someone at Quicken Loans to see if they use “Quicken” internally informally, and she said $&#&^$! yes (I’m paraphrasing here to maintain her reputation). So maybe the journalist had some communication with internal people who used “Quicken” instead of the company name, or perhaps that’s what everybody but me does.

However, informal nomenclature aside, Quicken Loans != Quicken, and to refer to it as such could have consequences. If this story hit the wires and Intuit’s stock dropped a bunch, ay! Or something more sinister, which in this case means unintended and unforeseen consequences.

My point is to take a little time to research the approved use of trademarks, brand names, and company names before you start testing or writing about them. Don’t trust the developers (or journalists, apparently) to have done this for you.

Categories: Software Testing

QA Music: Where The Wild Things Are Running

QA Hates You - Mon, 06/20/2016 - 04:21

Against the Current, “Running With The Wild Things”:

I like the sound of them; I’m going to pick up their CD.

When the last CD is sold in this country, you know who’ll buy it. Me.

(Link via.)

Categories: Software Testing

Good Reasons NOT to Log Bugs

Eric Jacobson's Software Testing Blog - Thu, 06/16/2016 - 13:14

I noticed one of our development teams was creating a new Jira Issue for each bug found during the development cycle.  IMO, this is an antipattern. 

These are the problems I can think of that it can create:

  • New Jira Issues (bug reports) create unnecessary admin work for the whole team. 
    • We see these bug reports cluttering an Agile board.
    • They may have to get prioritized.
    • We have to track them, they have to get assigned, change statuses, get linked, maybe even estimated.
    • They take time to create. 
    • They may cause us to communicate via text rather than conversation.
  • Bug reports mislead lazy people into tracking progress, quality, or team performance by counting bugs.
  • It leads to confusion about how to manage the User Story.  If the User Story is done except for the open bug reports, can we mark the User Story “Done”?  Or do we need to keep the User Story open until the logged bugs get fixed…”Why is this User Story still in progress?  Oh yeah, it’s because of those linked logged bugs”.
  • It’s an indication our acceptance criteria are inadequate.  That is to say, if the acceptance criteria in the User Story are not met, we wouldn’t have to log a bug report.  We would merely NOT mark the Story “Done”.
  • Bug reports may give us an excuse not to fix all bugs…”let’s fix it next Sprint”, “let’s put it on the Product Backlog and fix it some other day”…which means never.
  • It’s probably a sign the team is breaking development into a coding phase and a testing phase.  Instead, we really want the testing and programming to take place in one phase...development. 
  • It probably means the programmer is considering their code “done”, throwing it over the wall to a tester, and moving on to a different Story.  This misleads us on progress.  Untested is as good as nothing.

If the bug is an escape, meaning it occurs in production, it’s probably a good idea to log it.

Categories: Software Testing

Shorten the Feedback Loop, Unless…

Eric Jacobson's Software Testing Blog - Wed, 06/15/2016 - 07:59

On a production support kanban development team, a process dilemma came up.  In the case where something needs to be tested by a tester:

  1. Should the tester perform the testing first in a development environment, then in a production-like environment after the thing-under-test has been packaged and deployed?  Note: in this case, the package/deploy process is handled semi-manually by two separate teams, so there is a delay.
  2. Or, should the tester perform all the testing in a production-like environment after the thing-under-test has been packaged and deployed?

Advantage of scenario 1 above:

  • Dev environment testing shortens the feedback loop.  This would be deep testing.  If problems surface they would be quicker and less risky to fix.  The post-package testing would be shallow testing, answering questions like: did the stuff I deep tested get deployed properly?

Advantage of scenario 2 above:

  • Knock out the testing in one environment.  The deep testing will indirectly cover the package/deployment testing.

On the surface, scenario 2 looks better because it requires only one testing chunk, NOT two chunks separated by a lengthy gap.  But what happens if a problem surfaces in scenario 2?  Now we must go through two lengthy gaps.  How about a second problem?  Three gaps.  And so on.

My conclusion: Scenario 1 is better unless this type of thing-under-test is easy and has a history of zero problems.

Categories: Software Testing

An Oldie, But An Oldie

QA Hates You - Tue, 06/14/2016 - 03:53

Round round work around
I work around
Yeah
work around round round I work around
I work around
work around round round I work around
From job to job
work around round round I work around
It’s a real cool app
work around round round I work around
Please don’t make it snap

I’ve got little bugs runnin’ in and out of the code
Don’t type an int or it will implode

My buttons don’t click, the users all moan
Yeah, the GUIs are buggy but the issues are known

I work around
work around round round I work around
From town to town
work around round round I work around
It’s a real cool app
work around round round I work around
Please don’t make it snap
work around round round I work around
I work around
Round
work around round round oooo
Wah wa ooo
Wah wa ooo
Wah wa ooo

We always make a patch cause the clients get mad
And we’ve never missed a deadline, so it isn’t so bad

None of the data gets checked cause it doesn’t work right
We can run a batch job in the middle of the night

I work around
work around round round I work around
From job to job
work around round round I work around
It’s a real cool app
work around round round I work around
Please don’t make it snap
work around round round I work around
I work around
Round
Ah ah ah ah ah ah ah ah

Round round work around
I work around
Yeah
work around round round I work around
work around round round I work around
Wah wa ooo
work around round round I work around
Oooo ooo ooo
work around round round I work around
Ahh ooo ooo
work around round round I work around
Ahh ooo ooo
work around round round I work around
Ahh ooo ooo

I don’t want to make you feel old, old man, but most of your co-workers don’t remember “Kokomo”, much less “I Get Around”, and probably think the Beach Boys were the guys on Jersey Shore.

Categories: Software Testing

QA Music: Fire, Fire

QA Hates You - Mon, 06/13/2016 - 04:26

Puscifer, “The Arsonist”

Few songs use the word “deleterious.” Too few, if you ask me.

Categories: Software Testing
