Redesigning the Project Summary Page

By Community Team

A New Look

Over the course of a month, we typically serve about 14 million project summary pages. These landing pages are the first entry point to your project listing at SourceForge, providing a comprehensive at-a-glance summary of everything your project has to offer, enabling visitors to make an informed decision about whether they'd like to download and try it.

We’ve spent the past month hard at work optimizing the page’s visual appeal as well as its ability to convey information to the visitor as effectively as possible. The results of these efforts launched publicly this week, and we’d like to take the opportunity to talk about where we came from and why certain design decisions were made.

[Image: Before/After]

The primary goal of the new design was to improve the presentation of the information about each project, thereby increasing the likelihood of converting a visitor to the summary page into a downloader of your software.

Let’s take a look at the elements that make up the redesigned page.

A Quick Tour

[Image: Annotated version of the redesigned summary page]

  1. We introduced a new project summary field, limited to 70 characters, which ideally contains a one-sentence description of your project. We show it prominently near the top of the page as an easily scannable abstract of what your project does.

  2. Installing software is a matter of trust. To establish that trust, we show the number of recommendations a project has received as well as the number of downloads it generated over the course of a week. Both numbers link to more detailed views for users eager to learn more.

  3. Our trusty download button has been placed prominently near the top of the page and expanded in size to make it the obvious click target for the next action after evaluating the presented information.

  4. Easy social sharing buttons allow a user to tell the world about their great find with a single click.

  5. Screenshots easily communicate what a project looks and feels like when installed. We present all of them in a thumbnail carousel, with the full images available in a slideshow.

  6. The full description and feature bullets outline the project’s primary features in more depth, giving the more curious user additional information to evaluate before proceeding to a download.

  7. The sidebar provides a rich set of additional metadata about each project, including easily scannable icons for the platforms the software runs on, the programming language it is written in, and the available translations.

  8. The most helpful reviews are also included on the main summary page instead of being hidden in a separate tab. The list of top reviews also includes the current recommendation score as a percentage of users recommending the project.

But how do you arrive at authoritative numbers for how effective a redesign really is? What follows is a quick primer on the testing methods we employed.

Test, Rinse, Repeat

The new design was built on a slew of data from our visitors. Not only were those insights useful in shaping the design, but they are also of interest to project admins looking to increase downloads. We ran two types of tests: A/B tests and user tests. The A/B tests allowed us to see, with large sample sizes, how well the new design helped users achieve the goal of downloading the project. In other words, they told us whether or not our changes worked. The user testing, on the other hand, provided us with insight into why they did (or didn’t) work.

In an A/B test, a certain percentage of website visitors sees the normal page (variant A) while the rest see a variation (variant B). A goal is defined for both pages, and whichever page is more successful at achieving that goal wins the test.
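The mechanics are easy to sketch in code. Below is a minimal illustration of one common approach, hash-based bucketing, where a stable visitor ID determines the variant so a returning visitor always sees the same page. This is not our actual implementation; the visitor ID, experiment name, and traffic split are all hypothetical.

    import hashlib

    def assign_variant(visitor_id: str, experiment: str, b_fraction: float = 0.5) -> str:
        """Deterministically assign a visitor to variant 'A' or 'B'."""
        # Hash the experiment name and visitor ID together so different
        # experiments bucket the same visitor independently.
        digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
        bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform value in [0, 1]
        return "B" if bucket < b_fraction else "A"

    # The same visitor always lands in the same bucket across page loads.
    print(assign_variant("visitor-12345", "summary-redesign"))

Each completed download is then recorded against the variant that visitor saw, yielding the per-variant conversion rates that decide the test.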

While A/B testing provides great quantitative data, it doesn’t provide any insight into why users behave the way they do. For that we turn to user testing, which is typically conducted with a much smaller sample size of 3-4 users. User testing usually involves bringing in a user, providing them with a script for a particular task on the site, and then recording them as they walk through it. Users are encouraged to verbalize their thought process so that it’s clear why they do what they do.


The First Results Roll In

[Image: First revision of the redesigned summary page]

We launched our first A/B test of the redesigned project summary page on September 30th, with a layout similar to the one shown above. It ran for a week, and, what can we say, it didn’t go well. (The astute reader will have noticed that the design we actually launched looks quite different, so this shouldn’t come as a surprise.)

[Image: A/B testing results, revision 1]

The original, untouched project summary page outperformed the redesigned page by a clear margin in converting visitors into downloads.
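As an aside, a standard way to check that a conversion gap like this reflects a real difference rather than random noise is a two-proportion z-test. The sketch below is purely illustrative and uses invented visitor and download counts, not our actual test data.

    from math import sqrt

    def z_score(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
        """Two-proportion z-test for the conversion rates of variants A and B."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
        return (p_a - p_b) / se

    # Hypothetical counts: 5,200 downloads from 100,000 control visitors vs.
    # 4,700 from 100,000 redesign visitors. |z| > 1.96 means the difference
    # is significant at the 95% confidence level.
    print(z_score(conv_a=5200, n_a=100_000, conv_b=4700, n_b=100_000))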

To get more insight into what actually made matters worse, we set up a test scenario with UserTesting.com, giving testers a simple set of steps to perform on the page (“Evaluate and download the software”) and report back the results. What’s nice about UserTesting.com’s offering is that testers record a video of the process, which gave us great insight into the issues the testers (and likely the users in the A/B test) were struggling with in the new layout.

The main issue we identified from the test results was that the download button, originally placed in the right sidebar, ended up way out of sight. Oftentimes, testers would mistake the whole region for a banner ad rather than additional information about the project and would simply overlook our traditional download button.

Additional comments from testers confirmed that screenshots are the number one target they jump to when evaluating a new piece of software.

The testers also mostly ignored the full description that we printed right below the project name, which eventually led to the introduction of the 70-character summary field that now sits prominently between the project name and the screenshots, in a slightly larger font size.

The last step of the scenario that we asked testers to go through was to share the project summary page with their friends on a social network. The sharing buttons were initially also placed in a sidebar and testers weren’t always able to discover them quickly.

Back to the Drawing Board

With the insight gained from the user testing, we went back to the drawing board to optimize the presentation of the various elements. We were confident that the updated visual appeal was heading in the right direction, and started re-arranging elements according to their priority in helping a visitor evaluate a given project.

The second revision of our redesign launched into A/B testing on October 7 and also ran for about a week. This variation outperformed the original, as you can see in the graph below.

[Image: A/B testing results, revision 2]

Still, we were curious what kind of feedback we would get from user testing and asked a new round of testers to go through the same scenario as before.

The testers confirmed that our changes made it a no-brainer to download a piece of software and share it with their social networks afterwards, which means we accomplished our mission of making the page both visually more appealing and functionally more successful at converting visitors into downloaders.

Still, with the help of user testing we identified a couple of minor issues, mainly around the presentation of reviews, which we addressed prior to publicly launching the page.

Conclusion

Using state-of-the-art testing methodologies, we were able to launch a redesign of one of SourceForge’s key pages with numbers to back up our design choices. We were able to make informed decisions and fail fast on revisions of the design that simply didn’t work.

The new design of the summary page is measurably better at driving software downloads, presenting project information in an improved fashion and communicating a project’s features and benefits clearly to a potential user. We will continue to monitor our conversion rates and improve the layout of the page.

Along the way, we also learned about a few differentiators that give you, as a project owner, the chance to improve the presentation of your project, thus increasing the chances of getting additional downloads of your software.

  • Visual and graphic elements are the main factor for users evaluating your project. Please make sure you upload several representative screenshots showing your project in action. We also use project icons in various places for cross-promotion of projects, and additional A/B testing in these areas showed a massive improvement in conversion rates for projects that have a project icon set. The combination of these factors can increase downloads by up to 90%.

  • Well-worded, easily understandable project descriptions that clearly communicate the purpose of your project to the end user are another positive element of project presentation. We recommend that you make use of the new summary field as well, which lets users evaluate your project’s goals even more quickly, without resorting to the full description. You should also make good use of the expanded limit for the full project description, which grew from 250 to 1,000 characters. (A quick way to check both limits is sketched after this list.)

  • Reviews, even negative ones, are also very important as an evaluation gauge. Users will often explicitly value a large number of recommendations and positive reviews, but they are curious to check out negative feedback as well.
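For project admins scripting their metadata updates, here is a minimal sketch that checks the two limits mentioned above: the 70-character summary and the 1,000-character full description. The function name and messages are our own invention for illustration, not part of any SourceForge API.

    SUMMARY_LIMIT = 70        # one-sentence summary field
    DESCRIPTION_LIMIT = 1000  # expanded full-description limit

    def check_project_text(summary: str, description: str) -> list:
        """Return a list of problems with a project's summary and description."""
        problems = []
        if not summary.strip():
            problems.append("Summary is empty; add a one-sentence description.")
        elif len(summary) > SUMMARY_LIMIT:
            problems.append(f"Summary is {len(summary)} chars; the limit is {SUMMARY_LIMIT}.")
        if len(description) > DESCRIPTION_LIMIT:
            problems.append(f"Description is {len(description)} chars; the limit is {DESCRIPTION_LIMIT}.")
        return problems

    # An empty list means both fields fit within their limits.
    print(check_project_text("A fast, cross-platform FTP client.", "Full description here."))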

With that, we’re closing our review of the summary page redesign process and, as always, welcome your feedback and bug reports in the comments and our support system. Or swing by IRC, #sourceforge on freenode, and talk to the developers behind the redesign in person!