A vc.ru columnist looked into how to put the resources of "sleeping" hardware and networks to work so they don't sit idle. A significant share of a computer's resources is often wasted: the owner steps out to make tea or sits in planning meetings; someone buys a "gaming laptop" but for the third month running only has time for work; someone sets up a small farm of graphics cards, then drops the mining, or the rendering, or whatever else it was for. That unused computing power may be exactly what someone else needs: a neighbor on the local network who does three-dimensional modeling; scientists at small research centers that have no supercomputers of their own, only the resources volunteers make available; or enthusiasts searching for extraterrestrial life.
Hackers need it too, to build "zombie networks" for staging DDoS attacks, but more on that separately. The ideological foundations of distributed computing were laid long ago. The first experiment in tapping the resources of idle devices dates back to 1973, when two employees of the Xerox PARC research center, John Hupp and John Shoch, wrote a program that ran computations at night on computers connected to the center's local network.
Twenty years later, in 1993, Eric Schmidt, then working at Sun Microsystems, said: "When the network becomes as fast as the processor, the computer hollows out and spreads across the network." This idea formed the basis of the concepts of cloud services and grids, where what matters to the user is not how fast the device is but only the bandwidth of the channel. But cloud computing means centralization, concentrating computation on the side of the companies that own the servers. There is another way, one that lets individual users band together to tackle global problems on a scale otherwise available only to large corporations.
Within a year of Schmidt's speech, the idea was proposed for a volunteer distributed-computing project that eventually became the most famous of them all: SETI@home, which searches for signals from extraterrestrial civilizations. Abandoning the concentration of computing power in one place gives distributed systems an important advantage over supercomputers: the potential for unlimited growth in performance through free scaling. No less important for understanding volunteer computing and metacomputing are the ideas of crowdsourcing and utility computing. Utility computing rests on the notion that access to remote resources that users are willing to share with one another can significantly raise the overall performance of the world's computers. David Anderson, the head of SETI@home, for example, sees in the development of distributed computing the possibility of creating what he calls an "Internet-scale Operating System" (ISOS), on which users could not only share resources but also earn money doing so.
At the same time, distributed computing is not a horizontal structure, unlike, say, P2P: there is a certain hierarchy, with resources subordinated to one grand task toward which users direct them. Volunteer computing is not the only way to put spare capacity to full use. You can simply seed torrents of bibliographic rarities or old games, giving them away actively for the sake of free content distribution; you can rent out your resources; or you can mine cryptocurrency where that is permitted.
Even if you are confident in your own law-abidingness, there are pitfalls here too: in March 2015, for example, one of the µTorrent updates installed the Epic Scale program on users' computers, which mined bitcoins without the owners' knowledge. In Russia one should be careful with this and keep an eye on legislation and enforcement practice. Any freely idle resources are also of interest to hackers building "zombie networks." On the other hand, a number of projects use DDoS attacks as a form of civil disobedience, with users providing resources on their own initiative to put artificial load on particular websites.
There are quite a lot of such projects. Some are organized on a purely volunteer basis; some offer participants financial incentives in one form or another; some are built around a powerful community competing over who has crunched the most; others emphasize that the problems they solve are attractive because of their value to humanity. Today most volunteer-computing projects are built on the BOINC software platform, which gives researchers around the world access to the resources volunteers provide. BOINC is cross-platform, relatively easy to set up, and demands minimal attention from the user: the client can run "under the screensaver" and does not touch resources the owner needs, taking only what is free (or, if configured differently, only what has been allocated to it).
Over six years the combined performance of BOINC projects grew from 5.2 to 28.7 petaflops, leaving far behind the peak performance of the world's most powerful supercomputers (for comparison, the Japanese K computer processes 8.16 petaflops). SETI@home, a project of the University of California, Berkeley, has been searching for extraterrestrial civilizations since 1999, processing radio signals picked up by the telescope of the Arecibo Observatory. In that time, signals from 98% of the observed sky have been examined (albeit in a fairly narrow band). More than four million users have taken part in the project.
Einstein@Home searches deep space for pulsars, sources of a particular kind of electromagnetic radiation. For a long time the project's main goal was considered to be confirming Einstein's prediction of gravitational waves, but in the end the signal that proved their existence was too short to fall within the data the project processes. Despite this, Einstein@Home continues to operate, identifying new sources of gravitational radiation.
MilkyWay@home is another deep-space project: its distributed capacity is used for three-dimensional modeling of galaxy-formation processes. Closer to Earth are Asteroids@home, which studies small celestial bodies, and Climateprediction.net, which models weather conditions on Earth; there are still many places where meteorologists lack the computing power to calculate weather forecasts.
BOINC also hosts projects working on problems of the microworld. LHC@home, for instance, is designed to help with calculations for physicists working with the Large Hadron Collider at CERN: CERN's own distributed network, although the most powerful in the world, sometimes falls short. Folding@home simulates the structure of proteins, which helps in the search for treatments for HIV, cancer, Alzheimer's, and malaria. There are also separate projects whose capacity is aimed at fighting specific diseases.
Mathematical computation occupies, for example, PrimeGrid and several other @home projects. A separate branch is IBM's World Community Grid: within WCG, calculations relate to renewable energy, drug discovery (against the Ebola virus, for example), and the human genome. There are also projects whose infrastructure is not built on BOINC. Among them are TeraGrid, which searches for clusters of "dark matter" in the Universe; MoneyBee, which calculates the volatility of equity markets; and NEESgrid, which lets geologists and architects model seismic activity.
Projects that promise direct income in one way or another tend to look like quackery, but there are some that work on a lottery principle: a predetermined prize is paid to the participant on whose device some intermediate or final condition of the project was reached. Distributed.net is a community of code-breaking projects (the RSA Challenges); as of August 2016 it had enumerated 20% of the 2⁶⁴ keys. GIMPS, which searches for primes of a special form, is backed by the Electronic Frontier Foundation, which offered $100 thousand for finding a Mersenne prime with 10 million digits.
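The "special form" GIMPS exploits is what makes such enormous numbers checkable at all: for a Mersenne number 2^p − 1 there is the fast Lucas-Lehmer primality test. A minimal sketch for small exponents (GIMPS itself runs the same iteration with FFT-based big-number multiplication):

```python
# Lucas-Lehmer test: M_p = 2**p - 1 is prime iff s_(p-2) == 0 (mod M_p),
# where s_0 = 4 and s_k = s_(k-1)**2 - 2. Applies to prime exponents p.

def is_prime(n: int) -> bool:
    """Trial division; good enough for filtering small exponents."""
    return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def lucas_lehmer(p: int) -> bool:
    """Return True if the Mersenne number 2**p - 1 is prime."""
    if p == 2:
        return True  # M_2 = 3 is prime; the iteration below needs p > 2
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# Exponents below 40 that yield Mersenne primes (3, 7, 31, 127, ...):
mersenne_exponents = [p for p in range(2, 40) if is_prime(p) and lucas_lehmer(p)]
print(mersenne_exponents)  # [2, 3, 5, 7, 13, 17, 19, 31]
```

Note that a prime exponent is necessary but not sufficient: p = 11 is prime, yet the test correctly rejects it, since 2¹¹ − 1 = 2047 = 23 × 89.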
The first prize went to Nayan Hajratwala, on whose machine a number a million digits long was found in 2000. Three more such numbers have been found since. Projects such as the ProcessTree Network and Parabon Computation invite users to provide their resources to commercial customers. To take part in distributed computing as an ordinary volunteer, almost nothing is required: install the client (BOINC, for example), choose a suitable project (say, on the website of the Russian-speaking community), adjust the configuration by specifying the maximum allowable load and connection times, and you're done.
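In BOINC's case, those "maximum allowable load" settings live in a small XML file, global_prefs_override.xml. A sketch of generating one in Python; the field names here follow BOINC's preferences format as I understand it, so treat them as assumptions and check them against your client version's documentation:

```python
# Generate a BOINC-style global_prefs_override.xml that caps resource use.
# The element names below are assumptions based on BOINC's preferences
# format; verify them against your installed client before relying on this.
from xml.etree import ElementTree as ET

prefs = {
    "run_if_user_active": "0",  # compute only while the machine is idle
    "idle_time_to_run": "5",    # minutes of inactivity before starting
    "cpu_usage_limit": "50",    # use at most 50% of available CPU time
    "max_ncpus_pct": "75",      # use at most 75% of the cores
}

root = ET.Element("global_preferences")
for name, value in prefs.items():
    ET.SubElement(root, name).text = value

xml = ET.tostring(root, encoding="unicode")
print(xml)
```

The point of the idle and percentage caps is exactly what the article describes: the client takes only what the owner is not using.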
The shell downloads tasks, the device processes them when it has the chance, and the client then sends the results back to the parent server. If you want to be more than just a user, you can try launching a project of your own: there are plenty of problems for which even a relatively small amount of resources is enough. One example is Sobnik, a Chrome extension developed by a "Habrahabr" user that separates realtors from owners on real-estate listing sites: with just under two hundred participants, its distributed system has processed more than 60 thousand listings per day.
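The fetch-compute-report cycle just described can be sketched as a toy in-process loop; the "work units" and names here are purely illustrative, not any real project's API:

```python
# Toy model of the volunteer-computing cycle: a coordinator queues work
# units, a client fetches them, computes locally, and reports results back.
from queue import Queue

work_units: "Queue[int]" = Queue()
for n in range(1, 6):  # each unit: "sum the first n squares"
    work_units.put(n)

results: dict = {}

def volunteer_loop() -> None:
    """One client's life: fetch a unit, compute it, report the result."""
    while not work_units.empty():
        n = work_units.get()                              # 1. download a work unit
        results[n] = sum(i * i for i in range(1, n + 1))  # 2. compute locally
        work_units.task_done()                            # 3. report back

volunteer_loop()
print(results)  # {1: 1, 2: 5, 3: 14, 4: 30, 5: 55}
```

In a real project the queue sits on the parent server and the transfer happens over the network, but the division of labor is the same: the coordinator only hands out units and collects answers, while all the heavy computation runs on the volunteers' machines.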
If you have the interest and the skills, you can also draw "Internet of things" components or game consoles into distributed computing: on the PlayStation 3, for example, a program for computations within Folding@home came preinstalled by the developers.