Speeding the search for gravity waves

El_Machinae

Well, with every volunteer, the process gets faster. With each computer innovation, the process goes faster.

But this fellow tweaked the code to halve the time it takes to do a calculation. That's awesome. He's basically doubled the value of every watt contributed to this effort.

I really think that distributed computing will help roll us forward with quite a few projects. What's needed are visionaries who can see problems being broken down into components ... and I guess they're out there.

Remember when there was an outcry regarding the funding of Seti@home? Well it looks like that effort has had spin-off benefits as the expertise migrates to other fields.

A global effort to detect gravitational waves has received an unexpected boost after a volunteer improved the computer code used to comb through data from ground-based detectors.
...
Each computer involved in Einstein@home is given a signal from a massive spinning neutron star, called a pulsar, to analyse. By searching for deviations in the pulsar's signal, the computer can flag it up for closer analysis, possibly revealing ripples in space-time.
...
Fekete analysed the code given to his computer and soon realised he could improve it. ... He modified the code and submitted it to the project co-ordinators. The modifications at least double the speed of calculations, says James Riordon, a spokesman for the APS. The scientists behind the project hope to distribute the modifications to all users when the next batch of data is ready for analysis in June 2006.

New Scientist

What other types of projects do people think could be done with distributed computing? What would be the value?

This is a viable alternative for people who think that folding proteins is a waste of time.
 
@home programs don't agree with my computer. :(
 
Slows down me comp as well; I had it for a while, then dropped it. And does it really do anything? What is it supposed to actually accomplish? How does it use other people's computers?
 
Homie said:
Slows down me comp as well; I had it for a while, then dropped it. And does it really do anything? What is it supposed to actually accomplish? How does it use other people's computers?
You have data, you need to perform tedious, repetitive, iterative calculations on it, so you just send it somewhere else to process it.
 
It's a good idea; they just need more robust programs (ones that don't crash other programs).

My main problem is the gargantuan fan that fires up when the program is running. Quite annoying when surfing the web (I have to turn up the speakers substantially to play CivIV).
 
Perfection said:
You have data, you need to perform tedious, repetitive, iterative calculations on it, so you just send it somewhere else to process it.
Just curious, wouldn't it take longer to send the data back and forth between the PC and the Supercomputer than if the supercomputer just did the calculations itself?
 
Homie said:
Just curious, wouldn't it take longer to send the data back and forth between the PC and the Supercomputer than if the supercomputer just did the calculations itself?

Given that @home programs are thriving - no. :)

These aren't simple calculations - your computer isn't just adding two numbers and sending them back. It's performing complex calculations that sometimes take hours.
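To make that trade-off concrete, here's a toy back-of-envelope check. All the numbers are invented for illustration, not Einstein@home's real figures; the point is just that distribution pays off when crunching time per work unit dwarfs the time spent shipping it.

```python
# Toy estimate of when distributed computing pays off: compare the time
# spent transferring a work unit against the time spent computing on it.
# All figures below are made up for illustration.

def worth_distributing(unit_mb, link_mbps, compute_hours):
    """Return True if compute time dwarfs transfer time for one work unit."""
    transfer_s = unit_mb * 8 / link_mbps   # seconds to ship the unit
    compute_s = compute_hours * 3600       # seconds of local number-crunching
    return compute_s > 100 * transfer_s    # crunching must dominate by 100x

# A 10 MB work unit over a 1 Mbit/s link that takes 5 hours to process:
# transfer is ~80 s versus 18,000 s of computing, so shipping it is cheap.
print(worth_distributing(10, 1.0, 5))  # -> True
```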
 
To add to what he said.

They could purchase a supercomputer to do the calculations, but that would cost gobs of money. Meanwhile, people have these great computers at home that aren't being used anywhere near capacity. Buying a supercomputer would be wasteful.

The supercomputer acts as a manager, and all the home computers do the grunt work. They work on each data packet for a couple of days (it only takes about a minute to download and upload the data).

By analogy, I'm an absolutely awesome ditch-digger. I'm really really good. And there are people who want a ditch dug. I cost $80/hour - I'm *that* good.

So, I could dig ditches. Really well. But there are a host of people standing around, doing nothing, who'd be willing to dig ditches. Sure, they can't do it near as well as I can, individually. BUT, if I were to spend my time managing them, their collective efforts will get more done than if I just worked on my own.

So, I have a choice. Dig the ditch, or manage all the volunteers? The supercomputer gets more done by managing volunteer computers. And we benefit, because we crack the mysteries of physics faster.
 
Much as distributed computing seems to be a good idea, I find they interfere with too many programs to run in the background, and so I don't run them.

I also have to say that this particular one doesn't seem a productive exercise, especially compared to things like protein folding. Even the climate prediction and SETI ones seem to offer more tangible benefits.
 
I hope that a CO2-reduction project will show up - apparently quite a lot of people tend to forget that they consume extra power running these applications and thus emit more CO2 than they otherwise would. :)

It is of course a matter of weighing the downsides against each other, but a fun irony nevertheless.

I'm pretty sure that the LHC (the Large Hadron Collider at CERN, which will help probe the Big Bang mystery by searching for the Higgs particle, among others) will - or will expect to - require outside help, especially considering the terabytes of data the project will have to process monthly.

Not believing whole-heartedly in global warming and CO2 emissions as the big boogeyman, I'll have few quarrels with myself about helping that particular project out.
 