Software Testing

GTAC 2014 Coming to Seattle/Kirkland in October

Google Testing Blog - Thu, 02/19/2015 - 06:22
Posted by Anthony Vallone on behalf of the GTAC Committee

If you're looking for a place to discuss the latest innovations in test automation, then charge your tablets and pack your gumboots - the eighth GTAC (Google Test Automation Conference) will be held on October 28-29, 2014 at Google Kirkland! The Kirkland office is part of the Seattle/Kirkland campus in beautiful Washington state. This campus forms our third largest engineering office in the USA.

GTAC is a periodic conference hosted by Google, bringing together engineers from industry and academia to discuss advances in test automation and the test engineering computer science field. It’s a great opportunity to present, learn, and challenge modern testing technologies and strategies.

You can browse the presentation abstracts, slides, and videos from last year on the GTAC 2013 page.

Stay tuned to this blog and the GTAC website for application information and opportunities to present at GTAC. Subscribing to this blog is the best way to get notified. We're looking forward to seeing you there!

Categories: Software Testing

GTAC 2014: Call for Proposals & Attendance

Google Testing Blog - Thu, 02/19/2015 - 06:21
Posted by Anthony Vallone on behalf of the GTAC Committee

The application process is now open for presentation proposals and attendance for GTAC (Google Test Automation Conference) (see initial announcement) to be held at the Google Kirkland office (near Seattle, WA) on October 28-29, 2014.

GTAC will be streamed live on YouTube again this year, so even if you can’t attend, you’ll be able to watch the conference from your computer.

Presentations are targeted at students, academics, and experienced engineers working on test automation. Full presentations and lightning talks are 45 minutes and 15 minutes respectively. Speakers should be prepared for a question and answer session following their presentation.

For presentation proposals and/or attendance, complete this form. We will be selecting about 300 applicants for the event.

The due date for both presentation and attendance applications is July 28, 2014.

There are no registration fees, and we will send out detailed registration instructions to each invited applicant. Meals will be provided, but speakers and attendees must arrange and pay for their own travel and accommodations.

Update: Our contact email was bouncing; this is now fixed.

Categories: Software Testing

The Deadline to Sign up for GTAC 2014 is Jul 28

Google Testing Blog - Thu, 02/19/2015 - 06:21
Posted by Anthony Vallone on behalf of the GTAC Committee

The deadline to sign up for GTAC 2014 is next Monday, July 28th, 2014. There is a great deal of interest in both attending and speaking, and we’ve received many outstanding proposals. However, it’s not too late to add yours for consideration. If you would like to speak or attend, be sure to complete the form by Monday.

We will be making regular updates to our site over the next several weeks, and you can find conference details there.

For those who have already signed up to attend or speak, we will contact you directly in mid August.

Categories: Software Testing

Announcing the GTAC 2014 Agenda

Google Testing Blog - Thu, 02/19/2015 - 06:20
by Anthony Vallone on behalf of the GTAC Committee

We have completed selection and confirmation of all speakers and attendees for GTAC 2014. You can find the detailed agenda on the GTAC site.

Thank you to all who submitted proposals! It was very hard to make selections from so many fantastic submissions.

There was a tremendous amount of interest in GTAC this year with over 1,500 applicants (up from 533 last year) and 194 of those for speaking (up from 88 last year). Unfortunately, our venue only seats 250. However, don’t despair if you did not receive an invitation. Just like last year, anyone can join us via YouTube live streaming. We’ll also be setting up Google Moderator, so remote attendees can get involved in Q&A after each talk. Information about live streaming, Moderator, and other details will be posted on the GTAC site soon and announced here.

Categories: Software Testing

GTAC 2014 is this Week!

Google Testing Blog - Thu, 02/19/2015 - 06:20
by Anthony Vallone on behalf of the GTAC Committee

The eighth GTAC commences on Tuesday at the Google Kirkland office. You can find the latest details on the conference at our site, including speaker profiles.

If you are watching remotely, we'll soon be updating the live stream page with the stream link and a Google Moderator link for remote Q&A.

If you have been selected to attend or speak, be sure to note the updated parking information. Google visitors will use off-site parking and shuttles.

We look forward to connecting with the greater testing community and sharing new advances and ideas.

Categories: Software Testing

10 Rules for Safety-Critical Code

QA Hates You - Thu, 02/19/2015 - 03:00

NASA’s 10 rules for developing safety-critical code:

NASA’s been writing mission-critical software for space exploration for decades, and now the organization is turning those guidelines into a coding standard for the software development industry.

The NASA Jet Propulsion Laboratory’s (JPL) Laboratory for Reliable Software recently published a set of code guidelines, “The Power of Ten—Rules for Developing Safety Critical Code.” The paper’s author, JPL lead scientist Gerard J. Holzmann, explained that the mass of existing coding guidelines is inconsistent and full of arbitrary rules, rarely allowing for now-essential tasks such as tool-based compliance checks. Existing guidelines, he said, inundate coders with vague rules, causing code quality of even the most critical applications to suffer.

It’s not The Programmer’s Book of Rules, but it’s worth reading and considering even if your software can’t kill people.

Categories: Software Testing

This Week in Web Performance

LoadStorm - Wed, 02/18/2015 - 11:34

There were a lot of interesting things going on in the web performance world this week! New Google tools and developer news made headlines, as well as the Ocean’s Eleven of cyber crime.

Cyber criminals make out with one billion dollars from more than 100 banks

In possibly one of the greatest bank heists ever, hackers installed spyware on the computers of over 100 different financial institutions in 25 different countries, including the US. By mimicking regular bank workflows, they were able to transfer between $2.5 million and $10 million from each bank into their own accounts. The Ocean’s Eleven of cyber attacks is suspected to have gone on for over two years and isn’t over yet. Kaspersky Lab says the “attacks remain active”, naming the malware “Carbanak”.

Kaspersky Labs uncovers the most advanced malware publisher they’ve ever seen

Kaspersky Lab recently released a report on a highly sophisticated threat actor it has named the Equation group. The group uses multiple techniques to infect victims, ranging from the internet to USB drives. Kaspersky called the group “the most advanced threat actor they have ever seen”, noting that its malware can even infect hard drive firmware. Kaspersky also stated that the Equation group’s developers may be the same people who created Stuxnet, a worm discovered in 2010 that is widely believed to be the work of the US government.

Google adds grace period for developers to fix security flaws

Project Zero, a Google team that aims to significantly reduce the number of people harmed by targeted attacks, gives vendors 90 days to patch security flaws. After that, the company automatically discloses the reported vulnerabilities, regardless of whether a fix is available. Project Zero has adhered to a 90-day deadline, but it has made a few changes to handle edge cases consistently: deadlines that end on weekends or holidays will move to the next normal work day, a 14-day grace period can be added, and CVE identifiers, an industry standard for identifying vulnerabilities, will be pre-assigned. The rest of Google will now be following the same policy.
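As a rough sketch of that deadline arithmetic (the function name and holiday handling here are illustrative, not Google's actual tooling), the policy can be modeled in a few lines:

```python
from datetime import date, timedelta

def disclosure_deadline(reported, holidays=(), grace=False):
    """Compute a Project Zero-style disclosure date: 90 days after the
    report, plus an optional 14-day grace period, then moved forward
    past weekends and holidays to the next normal work day."""
    deadline = reported + timedelta(days=90)
    if grace:
        deadline += timedelta(days=14)
    # weekday(): Monday=0 .. Saturday=5, Sunday=6.
    # `holidays` is a collection of date objects.
    while deadline.weekday() >= 5 or deadline in holidays:
        deadline += timedelta(days=1)
    return deadline

# A report filed on 2015-01-01 lands exactly 90 days later, on a
# weekday, so no nudge is needed.
print(disclosure_deadline(date(2015, 1, 1)))  # → 2015-04-01
```

A report whose 90th day falls on a Saturday would be nudged to the following Monday.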

Google announces a new way to evaluate cloud performance

How do you benchmark cloud providers? Google Cloud users were having trouble evaluating the performance of cloud offerings too. One week ago today, Google released PerfKit, a free, open source cloud performance benchmarking framework, to do just that. Google worked with over 30 different researchers, companies, and customers to build PerfKit Explorer, a tool that helps you interpret the results. In addition to reporting on the most standard metrics of peak performance, PerfKit measures the end-to-end time to provision resources in the cloud. You can try the new tool out yourself on Google Cloud Platform now. I definitely will be.

Also in Google news, this week Google Maps is celebrating its 10-year anniversary! Happy anniversary, Google Maps!

A look ahead: The FCC to vote on net neutrality

The FCC will vote on net neutrality regulations during its meeting on February 26th. Tom Wheeler has said he will circulate proposed new rules to preserve the internet as an open platform, and the vote on whether or not to classify the internet as a public utility will take place at that meeting.

Hear of any other interesting web or tech news this week? Let us know in a comment!

The post This Week in Web Performance appeared first on LoadStorm.

A Defect In The Style Guide

QA Hates You - Wed, 02/18/2015 - 05:42

The Wall Street Journal changes its capitalization of eBay depending upon where it appears.

For example, look at the print edition:

Or this article online: Carl Icahn Boosts Stake in EBay:

Carl Icahn slightly boosted his holdings in eBay Inc. by about $25 million in the fourth quarter, according to the activist investor’s latest quarterly filing.

Let’s look at the discrepancies and inconsistencies:

  • In the print edition headlines and drop quotes, only the E is capitalized.
  • In the online story, both the E and the B are capitalized in headlines.
  • When it appears at the beginning of a sentence, both the E and B are capitalized.
  • When the word appears in the middle of the sentence, it is appropriately capitalized as eBay.

This is a style guide issue: the capitalizations are inconsistent between contexts, but consistent within each one.

This doesn’t seem to be a bigger issue regarding trademark capitalization, as iPhone is capitalized correctly, at least online: ‘Staggering’ iPhone Demand Helps Lift Apple’s Quarterly Profit by 38%:

Apple Inc. surpassed even the most bullish Wall Street expectations for its holiday quarter with an improbable trifecta: selling more iPhones at higher prices—and earning more on each sale.

It looks as though the corporate style guide could use a little correction. How about yours?

Categories: Software Testing

Quick Reference: Fifty Quick Ideas To Improve Your User Stories

The Quest for Software++ - Wed, 02/18/2015 - 03:55

For teams that need a bit of inspiration during user story refinement workshops, here is a quick reference online mind map with all the ideas from the Fifty Quick Ideas To Improve Your User Stories. The mind map contains a short description and a reminder image for each idea, grouped into categories. Just tap/click a category to open up the details of ideas.

If you like this but you’d prefer something physical rather than electronic, we partnered with DriveThruCards to create a poker-style card deck with all idea summaries. For more info, see

Very Short Blog Posts (25): Testers Don’t Break the Software

DevelopSense - Michael Bolton - Tue, 02/17/2015 - 21:28
Plenty of testers claim that they break the software. They don’t really do that, of course. Software doesn’t break; it simply does what it has been designed and coded to do, for better or for worse. Testers investigate systems, looking at what the system does; discovering and reporting on where and how the software is […]
Categories: Software Testing

QA Music: Showing You Where Forever Dies

QA Hates You - Mon, 02/16/2015 - 03:22

Breaking Benjamin, “I Will Not Bow”

Categories: Software Testing

Give Us Back Our Testing

DevelopSense - Michael Bolton - Sat, 02/14/2015 - 17:47
“Program testing involves the execution of a program over sample test data followed by analysis of the output. Different kinds of test output can be generated. It may consist of final values of program output variables or of intermediate traces of selected variables. It may also consist of timing information, as in real time systems. […]
Categories: Software Testing

Wrong, In Context

QA Hates You - Fri, 02/13/2015 - 02:54

On the front page of the September 9, 2014, Wall Street Journal, we find that GE has exited the kitchen with the sale of its appliance business:

However, on page B3 the same day, the drop quote in an article would seem to indicate otherwise:

Of course, General Electric is not buying the food company Annie’s; the headline makes clear that the General in this case is General Mills (stock symbol: GIS).

But there’s nothing in the drop quote to indicate something is wrong within its own context. Maybe General Electric often pays premiums like that during an acquisition. Maybe the copy editor or whoever did this drop quote finished the GE story from page 1 just minutes before working on the General Mills story.

However, we’ve got to retain context when testing and proofreading.

Where does this come into play in testing?

The foremost example in my mind is when we’re doing things to trigger error conditions to make sure that an error message displays. It’s possible that the system will throw up the wrong error message and we’ll miss it. I once wrote automated tests that triggered error conditions and parsed the error message (mostly to make sure an error message applicable to the screen and operation displayed). However, I did not write them smart enough to compare the error message that displayed to the error message expected. So when the application started failing by displaying the wrong message for the occasion, the tests didn’t catch it.
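That lesson translates directly into automation. Here's a hedged sketch (the condition names and messages are invented for illustration, not from any real application): map each triggered error condition to the exact message expected, so a wrong-but-plausible message fails instead of passing.

```python
# Hypothetical mapping from each error condition a test can trigger
# to the exact message the UI should display for it.
EXPECTED_MESSAGES = {
    "empty_username": "Username is required.",
    "bad_email": "Please enter a valid email address.",
}

def check_error_message(condition, displayed):
    """Return None on a match, or a failure description showing both
    the expected and the actual message. Merely confirming that *an*
    error appeared would miss the wrong message for the occasion."""
    expected = EXPECTED_MESSAGES[condition]
    if displayed == expected:
        return None
    return f"{condition}: expected {expected!r}, got {displayed!r}"

# The wrong message for the occasion is now a reported failure.
print(check_error_message("empty_username",
                          "Please enter a valid email address."))
```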

So you’ve got to remember to see the forest and the trees–along with the underbrush, the soil, the other flora, and the carnivorous fauna–when you’re testing.

Categories: Software Testing

Load Testing Mobile Apps

LoadStorm - Thu, 02/12/2015 - 10:23

As mobile app usage increases, load testing mobile apps is becoming a key part of the software development lifecycle to make sure your application is ready for traffic. If your mobile app interacts with your application server via REST or SOAP API calls, then LoadStorm PRO can load test mobile app servers like yours.

LoadStorm PRO uses HTTP archive (HAR) recordings to simulate traffic, and we normally create these recordings using a browser’s developer tools on the network tab to record all requests sent to the target server. In this article, I’m going to introduce you to two ways to make recordings of user traffic that can be used in LoadStorm. The first involves packet capturing from your mobile device, and the second involves a Chrome app called Postman used in combination with the Chrome developer tools.

Simulate traffic by using Packet Capturing

If you like to keep things simple, then this method should save you the trouble of manually creating requests as shown in the next method. You’ll need a packet capturing mobile app (such as tPacketCapture) that stores requests in the PCAP file format, and allows you to share the PCAP file. The PCAP file generated by the app can then be converted to a HAR file to be uploaded into LoadStorm for use in a load test. To do this, you can use PCAP Web Performance Analyzer for free without needing any setup, or you can install your own converter from the pcap2har project on GitHub.

For Android devices, this method works as follows:

  1. Install tPacketCapture or other mobile app for capturing packets.
  2. Close any other apps that you have running to avoid unnecessary packets.
  3. Open the packet capturing app and click the Capture button.
  4. Open the mobile app you wish to record, and begin emulating user behavior.
  5. When you’re done emulating user behavior, swipe your notifications area open to click the “VPN is activated by tPack..” message.
  6. Click the Disconnect button to stop capturing packets.
  7. Switch back to the tPacketCapture app to open the File list tab.
  8. Select the PCAP file you just created, and share it using email (or method of your choice).
  9. Convert the PCAP into a HAR using the PCAP Web Performance Analyzer or your own pcap2har converter.
  10. Upload the HAR to LoadStorm.
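Before uploading, it can be worth sanity-checking the converted HAR. A HAR file is just JSON, so a few lines of Python can list the recorded requests (the sample below is an inline stub; point it at your converted file instead):

```python
import json

# Minimal sanity check on a converted HAR before upload: list the
# method, status, and URL of every recorded entry. The HAR text here
# is a placeholder; real recordings carry many more fields per entry.
har_text = """{
  "log": {"entries": [
    {"request": {"method": "GET", "url": "https://api.example.com/v1/items"},
     "response": {"status": 200}},
    {"request": {"method": "POST", "url": "https://api.example.com/v1/items"},
     "response": {"status": 201}}
  ]}
}"""

har = json.loads(har_text)
for entry in har["log"]["entries"]:
    req, resp = entry["request"], entry["response"]
    print(req["method"], resp["status"], req["url"])
```

If the requests you expected from your session aren't in the list, re-record before spending a load test on it.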


For iOS devices:

At this time, packet capturing mobile apps are only offered on Android devices. Apple products do not support direct packet capture services. However, if you hook up your iOS device to a Mac via USB, then you can use software supported on OS X to capture the packets, as described on the Apple developer site.

Record requests made in Postman using Chrome developer tools

To make a recording with the Postman app, follow these steps:

  1. From the Chrome Web Store, you can install Postman by rickreation, and the Postman launcher.
  2. Click the Postman launcher icon at the top-right of the browser.
  3. Open the developer tools by right-clicking the page you’re on, and selecting Inspect Element.
  4. Switch to the Network tab in the developer tools and check the Preserve log box.
  5. Create your GET or POST request.
  6. Click the Send button and observe the POST request in your developer tools.
    Note: Extra requests generated by Postman won’t appear in LoadStorm.
  7. Repeat steps 5 and 6 as needed to mimic user behavior.
  8. Right-click in the network log and choose Save as HAR with Content.



Making RESTful GET or POST requests will rely heavily on your knowledge of how your mobile app interacts with the application server. If you have a network log that shows the incoming requests from the mobile application, that can help simplify the reconstruction of those requests in the Postman app. Postman actually offers 11 methods for interacting with an application server, and of these we’ll typically only need to use GET and POST requests:

  • GET – For a RESTful request to interact via a GET request, you’ll usually need to include some parameter(s) as a query string that tells the application server what you would like returned. For example, you could request all of the tweets on your Twitter timeline.
  • POST – POSTs are used to give the target application new information, like adding a tweet to your Twitter account from your phone. The content that you would like to POST is sent via the request payload in one of three formats: form-data for multi-part forms, x-www-form-urlencoded for standard forms, and raw for a variety of content (JSON, XML, etc.).
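As a sketch of the difference, here both kinds of request are constructed (but not sent) with Python's standard library; the host, paths, and parameters are made up for illustration:

```python
import json
from urllib.parse import urlencode
from urllib.request import Request

# A RESTful GET carries its parameters in the query string.
params = urlencode({"screen_name": "loadstorm", "count": 10})
get_req = Request("https://api.example.com/timeline?" + params)

# A POST carries its content in the request payload; here once as
# x-www-form-urlencoded, and once as raw JSON.
form_req = Request(
    "https://api.example.com/tweets",
    data=urlencode({"status": "hello"}).encode(),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
json_req = Request(
    "https://api.example.com/tweets",
    data=json.dumps({"status": "hello"}).encode(),
    headers={"Content-Type": "application/json"},
)

print(get_req.full_url)
print(form_req.get_method(), json_req.get_method())
```

Note that `urllib` infers the method from the presence of a payload: no `data` means GET, a `data` body means POST.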

In some cases you will also need to add some form of authorization token in the request headers to let the application server know you have the rights to GET or POST information. Postman offers a few options to assist with authorization settings (e.g., Basic Auth, Digest, and OAuth 1.0), but you can always input the authorization header manually if you know the header name, its value, and the format to send it in.
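For Basic Auth specifically, the header value is just the base64-encoded `username:password` pair, so it is easy to build by hand the same way you would enter it manually in Postman (the credentials below are placeholders):

```python
import base64

def basic_auth_header(username, password):
    """Build an RFC 7617 Basic Auth header from raw credentials."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

print(basic_auth_header("user", "pass"))
# → {'Authorization': 'Basic dXNlcjpwYXNz'}
```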


Even though Postman is primarily designed to work with RESTful API calls, it can also work with SOAP.

To make SOAP requests follow these steps:

  1. Change the method to POST.
  2. Select the raw option for content type.
  3. Set the raw format to “text/xml”.
  4. Manually add the SOAP envelope, header, and body tags as required.
  5. Send requests as needed.
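The steps above can be sketched outside Postman as well: a hand-written envelope wrapped in a POST with a `text/xml` content type. The endpoint, namespace, and operation name below are placeholders, and the request is constructed but not sent:

```python
from urllib.request import Request

# A minimal SOAP 1.1-style envelope with explicit Header and Body tags.
envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Header/>
  <soap:Body>
    <GetStatus xmlns="http://example.com/service"/>
  </soap:Body>
</soap:Envelope>"""

# SOAP rides over a plain POST with a raw text/xml payload.
soap_req = Request(
    "https://api.example.com/soap",
    data=envelope.encode(),
    headers={"Content-Type": "text/xml"},
)
print(soap_req.get_method())
```

Some services also require a SOAPAction header; add it the same way as the Content-Type header if yours does.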


Additional information about creating SOAP requests can be found on the Postman blog.

Video Tutorial

This video is a short guide on recording a HAR using Postman in combination with the Chrome developer tools.

Also check out Ruairi Browne’s Postman tutorial covering RESTful API GETs and POSTs to Twitter as well as OAuth1.0 credentials.

Upload a Recording

To load test a mobile application using LoadStorm PRO, a HAR recording must be made to simulate the traffic. Once you’ve decided on a recording method, all you have to do is upload your HAR file into LoadStorm, and you’ll be on your way to load testing your app and ensuring end-user satisfaction. If you have questions or need assistance, please contact us, visit our learning center, or leave a comment below.


The post Load Testing Mobile Apps appeared first on LoadStorm.

Maybe I’m Doing Conferences Wrong

QA Hates You - Thu, 02/12/2015 - 03:31

Turn That Soul-Crushing Conference Into a Win:

You’ve spent days wandering the cavernous halls of a convention center, trapped in windowless rooms, drinking too much coffee and talking yourself hoarse. Does anyone ever emerge from a conference as the organizers intended, feeling recharged with new ideas, contacts and energy?

New York City marketing executive Stefany Stanley does. Among conference organizers she is known as a savvy convention-goer, someone with a strategy for rising above the dreary rounds of networking and breakout sessions. Ms. Stanley says she has gained valuable contacts, ideas and insights from the 15 conferences she has attended in the past five years.

The article goes on with tips and tricks for maximizing your chances of meeting people to sell your services to, or of meeting people who might help you get a leg up, basically.

I must be doing it wrong; when I go to conferences, I go to attend the sessions and to learn what the speakers have to offer as to professional insight. Maybe I’ll meet someone I know off the QAternet or something, but I don’t count on it, and if I don’t, I don’t think that I’ve lost something.

Of course, I don’t go to conferences and conventions often enough to have my soul crushed, and I don’t think of them primarily as mass in-person sales cold calls, so I’m probably doing them wrong when I do go.

But maybe you’ll find the article useful.

Categories: Software Testing

QA and the Motivational

QA Hates You - Wed, 02/11/2015 - 04:58

I go to a dojo filled with positive, encouraging, uplifting people, from the kyoshi to the black belts to the other students. You want to talk about imposter syndrome, and I’ll explain how I feel when surrounded by nice people.

At any rate, members of the dojo often post motivational images to, well, motivate each other. And immediately, my mind works to find the condition where the assertion is not true. I must subvert it.


Your excuses get you 0% closer to your goals. Unless you’re writing a book about your excuses, I think.

I don’t know how my story will end, but nowhere in my text will it ever read… “I gave up.”

I think:

That, my friends, is the essence of the QA mindset: When presented with a proposition, you find some way to subvert or suborn it. Remember that the contradiction of “All are…” is not “Nothing is…” but “Some are not…”. (See also the Aristotelian Square of Opposition.)

Plan your tests accordingly.

Categories: Software Testing

Oh, Yes, Grammar Matters

QA Hates You - Tue, 02/10/2015 - 10:00

Grammar Rules in Real Estate

Real-estate agents, better take out that red pen.

An analysis of listings priced at $1 million and up shows that “perfect” listings—written in full sentences without spelling or grammatical errors—sell three days faster and are 10% more likely to sell for more than their list price than listings overall.

On the flip side, listings riddled with technical errors—misspellings, incorrect homonyms, incomplete sentences, among others—log the most median days on the market before selling and have the lowest percentage of homes that sell over list price. The analysis, conducted by Redfin, a national real-estate brokerage, and Grammarly, an online proofreading application, examined spelling errors and other grammatical red flags in 106,850 luxury listings in 52 metro areas in 2013.

Think it applies only to real estate and not your product interface? Are you willing to take that gamble?

You’d better make sure your Web labels, error messages, and helpful text are grammatically correct, or you won’t be able to quantify how many people don’t use your software because they thought it was written by third graders. Because they won’t be your users.

(Yes, I know the story is nine months old, but I’m between contracts right now and have a little time to catch up on my newspapers for the last year.)

Categories: Software Testing

QA Music – I Want My….

QA Hates You - Mon, 02/09/2015 - 03:24

“Mr. MTV” by Nothing More

Categories: Software Testing

Very Short Blog Posts (24): You Are Not a Bureaucrat

DevelopSense - Michael Bolton - Sat, 02/07/2015 - 14:38
Here’s a pattern I see fairly often at the end of bug reports: Expected: “Total” field should update and display correct result. Actual: “Total” field updates and displays incorrect result. Come on. When you write a report like that, can you blame people for thinking you’re a little slow? Or that you’re a bureaucrat, and […]
Categories: Software Testing

How Important is Credibility to Software Testers?

Randy Rice's Software Testing & Quality - Fri, 02/06/2015 - 12:12
I have incorporated the importance of credibility in all my software testing training courses. Why? Because if people don't trust you as a person, they won't trust the information you provide. As testers, the end result of everything we do is information. That information must be timely, correct, and objective.

When teaching the importance of credibility, I always say "Just look at the current events to see how people have lost or gained credibility. You will find examples everywhere."

The issue of credibility came to mind in recent days as I have been watching the response unfold to the new revelation that Brian Williams admitted to incorrect reporting of an incident that occurred while embedded with troops in Iraq. Some might say Williams' account was mere embellishment of the event. Others, however, take a much stronger view and call it an outright lie. Williams has acknowledged the inaccuracy and issued an apology, which is always a good start to rebuild credibility.

In the case of Williams, the stakes are high since he is a well-respected news presenter on a major television network. This is also a good example of how credibility can rub off on the entire team. Williams' story may well cause many to stop watching the entire network for news reporting. People may ask, "How many other reporters embellish or hide the truth?"

Now other reports from Williams, such as his Hurricane Katrina coverage, have been called into question by reliable information from those who were there. Williams reported seeing a dead body floating face-down from his hotel in the French Quarter. The New Orleans Advocate states, "But the French Quarter, the original high ground of New Orleans, was not impacted by the floodwaters that overwhelmed the vast majority of the city."

I have a feeling this story will take on a life of its own. Now, there are accounts from the pilot of the helicopter saying they did take fire, just not RPG fire. That may help clarify things, but it also adds confusion to the issue, which doesn't help rebuild credibility.

As a software tester, you may be tempted to overstate how many problems you have seen in a system or application, just to get people to pay attention to the fact that there are problems. Don't succumb to that temptation. Instead, just show a really good example of a serious problem. If there are other similar issues, say so. As a tester, you can only report what you have seen. You can't predict what other problems may exist, even though you might be right.

In telling your testing story, the accurate, complete truth is all you need.

If you would like to read my three-part article series on How to Build, Destroy and Rebuild credibility, here are the links:

Building Your Credibility as a Tester

How to Destroy Your Credibility as a Tester

How to Rebuild Your Credibility as a Tester

I hope all this gives us all something to remember as we report what we see in testing!

I would love to hear your comments, questions and experiences.

And...I would like to wish my son, Ryan Rice, a happy 34th birthday today!

Have a great weekend,

Categories: Software Testing