By Brendan Cassidy – Open Innovation Manager
At General Fusion we believe that collaboration and the open sharing of results are key to unlocking fusion and transforming the world’s energy supply. That means we are always looking at ways to engage others in our efforts to accelerate our pursuit of commercially viable fusion energy.
We are actively engaged in the fusion community and regularly attend scientific conferences, collaborate with other fusion scientists such as those at Princeton Plasma Physics Laboratory (PPPL), and keep the public informed about our progress through the media.
Recently we’ve gone outside the box by crowdsourcing some of our scientific and engineering challenges. Crowdsourcing has been successful for technical powerhouses such as NASA, so we’re in good company in going to the crowd.
While our thorniest problems are better suited to a targeted group of fusion scientists inside our network, we saw that some of our problems may be solvable by individuals outside of our specific expertise – experienced engineers and physicists, or even garage tinkerers. We opened up three such challenges to the crowd, and it has been thrilling to see their solutions to real problems we face daily in our research and development activities.
Throughout the journey we’ve learned a lot. In the very spirit of collaboration and the open sharing of results, I thought I’d share our insights and have compiled a list of Five Things To Consider Before You Enlist The Crowd.
1) What’s Your Objective?
Do you really think the answer is out there? How about part of the answer? Is this a genuine attempt to encourage meaningful input from the crowd or is it perhaps a public relations exercise?
There are several benefits to crowdsourcing and different ways to do it.
In our three projects, General Fusion wanted to take a sincere, practical approach. The first project, “Method for Sealing Anvil under Repetitive Impacts against Molten Metal”, and the third, “Fast Current Switch in Plasma Device”, were hard-science engineering problems. We wanted real solutions to very definable problems, and that’s exactly what we got.
To add incentive to the projects, General Fusion offered cash prizes for the most successful entries, with top prizes ranging from $5,000-$20,000. Not bad for a few days’ work. General Fusion is serious about getting real solutions.
The second project, “Data-Driven Prediction of Plasma Performance”, was a bit more of a long shot – an exciting attempt to see if, armed with gigabytes of General Fusion’s data and statistical skills, the crowd could identify patterns in the data that would allow the company to improve the quality and performance of its plasma, a key component in creating fusion energy. We didn’t know if this was even possible, but we wanted to see if anyone out there could help us. This challenge drew the largest number of interested participants – nearly 700 – and while they didn’t exactly come up dry, the results produced nothing new or usable for General Fusion. We did, however, learn a lot about how we at General Fusion interact with our data, what the rapidly growing field of data science can and can’t do for us, and more great stuff about statistics that I will refrain from boring you with.
In addition to getting solutions to our problems, we saw some other interesting benefits from our challenges. There was a great deal of media interest when we announced that we were going to the crowd, so this could be an unconventional public relations approach (note that it is possible to anonymously run challenges, so if media or competitor attention is a downside for you, you can control that).
We ended up contracting one of our challenge winners to work on a project related to the problem he solved. This demonstrates how crowdsourcing could be used for what I like to call the Willy Wonka approach to recruitment.
2) Will the Crowd Produce Anything Worthwhile?
Sun Microsystems’ co-founder Bill Joy is famously quoted as saying, “No matter who you are, most of the smartest people work for someone else.” Case in point: the $20,000 prize for the first challenge was won by Kirby Meacham of Cleveland, Ohio, an MIT-trained mechanical engineer with more than 30 years of design experience and his name on 35 US patents.
That said, General Fusion’s scientists and engineers may well have solved the problem themselves had we chosen to work on it internally. But because this problem required no special knowledge of fusion or of anything specific to our technology, it was possible that somebody such as Kirby Meacham could solve it for less money and in less time. Given that Meacham’s solution allowed our team to concentrate on other problems, it was very worthwhile for us to be able to draw on his expertise.
As mentioned, the second challenge, Data-Driven Prediction of Plasma Performance, has not generated any results that our own data scientists haven’t already identified.
The third challenge, Fast Current Switch in Plasma Device, is particularly intriguing. We received forty-nine solutions from seventeen countries, showing just how global the crowd is. Further, twenty-five submissions had what I call a “nugget of a good idea” – something in a suggestion that’s worth exploring further, even if it doesn’t solve the exact problem at hand. First prize of $5,000 went to Vladimir Samara, a post-doctoral researcher at Notre Dame, Indiana, who earned his Ph.D. in plasma physics from the Open University in the UK, underscoring the international quality of the crowd.
3) Who’s Your Partner?
So, your intuition tells you there are thousands, perhaps millions of potential collaborators out there, but how do you find them and how do you work with them? You can’t simply post a challenge to your website and expect thousands of experts across the globe to find it. You also need to design the legal aspects of your challenge: what are the terms of an award? Who owns the IP of the winning submission? Of a losing submission?
For our three challenges, General Fusion partnered with InnoCentive of Waltham, Massachusetts, which already has a roster of 375,000 registered solvers from nearly 200 countries around the world. InnoCentive had already conducted successful challenges with the above-mentioned NASA, plus the US Air Force and Procter & Gamble, among others, so they knew how to run challenges, how to define the terms and prize amounts for an appealing challenge, and most importantly how to reach the right audience of potential solvers.
InnoCentive had three standard challenge formats (with custom formats an option), and we strategically ran one of each. Our first two challenges both offered a conditional prize for a solution that met all requirements, but the first one asked only for a design, while the second one asked for solvers to actually build something. The third challenge offered a guaranteed payout to be split among a handful of top submissions, and asked only for undeveloped ideas rather than detailed designs. In this third challenge, we got IP rights for all submissions, while for the first two only solutions awarded prizes transferred IP rights.
Sound a bit confusing? It can be, but InnoCentive helped us navigate the whole process and suggested which structure of challenge was best suited for each problem.
There are many different companies offering a variety of crowdsourcing platforms. Start by finding a few platforms that seem to fit your types of problems, and get more information directly from them – most will spend a few hours on the phone or in person with you to discuss how their approach to crowdsourcing could work for you.
4) How Much is a Solution Worth to You?
If you run a challenge with a prize of $10,000, the total cost to you is actually several times more than that. There are three main costs to running a crowdsourced challenge: the prize amount paid to the winner(s), the cost of posting the challenge through your partner (in our case, InnoCentive), and the internal cost to your business to define the challenge and to evaluate the submissions.
You need to pick a prize amount that is appealing to potential solvers. You’re asking a bunch of people to invest their own time to work on your problem without any guarantee that they will get anything in return, so the prize needs to be a relatively big reward for their effort. Trust your partner’s experience in selecting the prize amount.
Your partner isn’t working for free either: these crowdsourcing platform companies charge a fee to help you design your challenge, handle the legal IP agreements, promote the challenge, communicate with solvers, etc. Be sure to get an idea of this cost going in.
Even though the purpose of crowdsourcing is to outsource your problem solving to the crowd, the quality of your solutions will be dependent on how well you define the problem. Just like any outsourcing, you have to very clearly define what you are looking for and (perhaps more importantly) what you are not looking for. You will probably have to answer questions while the challenge is running, and you will have to evaluate every submission you receive. This takes your time, and you know better than anyone that your time isn’t free.
Before you crowdsource a problem, add up all of these costs. Get an idea from your partner for the prize and fee amounts. Estimate how much time you think you’ll spend setting up the challenge and evaluating the results, then triple that amount (no, seriously). If getting a solution to your problem isn’t worth that amount, maybe don’t crowdsource that particular problem, or perhaps revisit your particular approach to that problem – different types of challenges cost different amounts.
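To make the arithmetic above concrete, here is a minimal back-of-envelope sketch. All figures and the function name are illustrative assumptions for this post, not General Fusion’s actual numbers or InnoCentive’s actual fees:

```python
def challenge_cost(prize, platform_fee, internal_hours, hourly_rate,
                   time_buffer=3.0):
    """Rough total cost of running a crowdsourced challenge.

    The three buckets are the prize, the platform partner's fee, and
    your own staff time -- tripled, per the advice above, because
    defining the problem and evaluating submissions always takes
    longer than you expect.
    """
    internal_cost = internal_hours * hourly_rate * time_buffer
    return prize + platform_fee + internal_cost

# Hypothetical example: a $10,000 prize, a $15,000 platform fee, and
# 50 hours of staff time at $100/hour quickly becomes a $40,000 project.
total = challenge_cost(prize=10_000, platform_fee=15_000,
                       internal_hours=50, hourly_rate=100)
print(total)  # 40000.0
```

If that total is more than a solution is worth to you, that is your signal to restructure the challenge or keep the problem in-house.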
5) Do You Understand the Risk?
By its very nature, a crowdsourced challenge is a shot in the dark. The participants are fully briefed with the information you give them, but they are outsiders addressing the problem in this specific context for the first time. There’s no guarantee that a challenge will produce a useful solution, and even if you don’t have to award any prize money, you have still invested time and money into running the challenge.
Although our data challenge did not produce any results we hadn’t already thought of, many very competent people put a lot of time into it, and the team here decided to award smaller prizes to two challengers who had gone above and beyond in their solutions. One of these turned out to be a pleasant surprise – Ian Caven may have collaborated with us virtually through a global tool, but he happens to be right here in Burnaby, not far from General Fusion’s headquarters. Not only that, he won a technical Oscar in 2012 for advancing image processing technology for Hollywood. We of all people shouldn’t be surprised that innovative solutions can come from unlikely places, but it’s always a pleasant reminder.
When considering the risks in crowdsourcing, it’s important to remember that you take a risk every time you tackle a new problem. Sometimes you spend a lot of time internally and come up empty-handed, or take many times longer to solve a problem. Don’t immediately eliminate crowdsourcing because it isn’t a guaranteed victory. Instead, use this information to calculate the different costs and risks up front when evaluating the best way to tackle any given problem.
Risks aside, this is a unique way to conduct a world-wide brainstorming session. As I was overseeing the project from General Fusion’s end, the advantages of virtually gathering together hundreds of thousands of engineers, scientists and garage tinkerers from around the world were clear. Imagine how difficult that would be to accomplish in person – “How about next Thursday? Can you all do next Thursday?”
The Bottom Line
The bottom line? Crowdsourcing challenges take time and resources to run, but if you plan things right (and bear the points above in mind) it’s possible to get some intriguing new perspectives on a problem and a net return on your investment. Though you’re not about to see us crowdsource every aspect of our development, our experience with crowdsourcing has taught us that if you ask the right questions, the wisdom of the crowd may just provide the answers you didn’t even know were out there.
About the Author
Brendan Cassidy is a systems engineer and project manager at General Fusion. Before spearheading General Fusion’s crowdsourcing initiative, he managed the construction and operation of General Fusion’s PI2 plasma injector, the world’s largest such device. In the spirit of open innovation, he is now working on a program to enable sharing of General Fusion’s terabytes of experimental data with the scientific community.