techno.rentetan.com – A technological malfunction recently occurred at Kyoto University in Japan, erasing a large amount of important data. During what was meant to be a standard backup operation, Kyoto University’s supercomputer system mistakenly deleted 77 terabytes of data, resulting in the loss of valuable research.
Approximately 34 million files belonging to 14 separate research groups that had been using the school’s supercomputing system were lost as a result of this problem, which occurred sometime between December 14 and December 16. The institution runs Cray computer systems backed by a DataDirect ExaScaler storage system, which research teams use for a variety of applications.
The work of at least four research groups cannot be recovered after the failure, though it is not yet clear what sort of data was erased or what caused the problem.
BleepingComputer, which first reported on this incident, points out that operating a supercomputer costs hundreds of dollars per hour.
Details of Kyoto’s unfortunate episode were first disclosed in mid-December by the highly regarded institution, which receives considerable grants and funding.
The announcement began by addressing the audience as “dear fellow supercomputing service users” (translated from Japanese via Google). “Today, a malfunction in the storage system’s backup procedure resulted in the loss of certain data in /LARGE0. The issue has been resolved, but we may have lost roughly 100 terabytes of data, and we are evaluating the extent of the damage,” the university said.
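The announcement does not explain how a backup procedure could itself destroy data, but a classic failure mode is a cleanup step whose target path is built from a variable that unexpectedly evaluates empty, so the deletion lands on the backup root instead of one subdirectory. The sketch below (in Python, with entirely hypothetical paths and function names, not the actual Kyoto script) illustrates that failure mode and a guarded alternative:

```python
import os
import shutil
import tempfile

def prune_old_backups(backup_root: str, subdir: str) -> str:
    """Naive cleanup: delete one backup subdirectory under backup_root.

    If `subdir` is accidentally empty (e.g. an unset config value),
    os.path.join(backup_root, "") is backup_root itself, so the
    whole archive is deleted instead of one subdirectory.
    """
    target = os.path.join(backup_root, subdir)  # "" -> backup_root itself!
    shutil.rmtree(target)
    return target

def prune_old_backups_safe(backup_root: str, subdir: str) -> str:
    """Same operation, but refuses to run with a dangerous subdir value."""
    if not subdir or os.path.normpath(subdir) in (".", "/", os.sep):
        raise ValueError("refusing to delete backup root: bad subdir")
    target = os.path.join(backup_root, subdir)
    shutil.rmtree(target)
    return target

if __name__ == "__main__":
    # Demonstrate the hazard inside a throwaway temp directory.
    root = tempfile.mkdtemp()
    os.makedirs(os.path.join(root, "old"))
    prune_old_backups(root, "")   # deletes `root` entirely, not just "old"
    print(os.path.exists(root))   # the whole backup root is gone
```

The guard in the second function is a common defensive pattern: validate every externally supplied path component before passing it to a recursive delete.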
Supercomputing differs from conventional computing in its speed and its capacity to harness many computer systems at once to solve complex mathematical problems. Because of these advantages over conventional computers, it is used across a wide variety of scientific fields, from climate and atmospheric modeling to physics and vaccine research. All of it, though, will be for nothing if the computer malfunctions.
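The parallelism described above can be illustrated in miniature: split one large calculation into chunks and compute the chunks on separate processes. This is only a toy sketch on a single machine (real supercomputing workloads typically use frameworks such as MPI across thousands of nodes), and all names here are hypothetical:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum of squares over [lo, hi) -- a stand-in for real numeric work."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n: int, workers: int = 4) -> int:
    """Split [0, n) into chunks and sum the pieces across worker processes."""
    step = (n + workers - 1) // workers
    chunks = [(i, min(i + step, n)) for i in range(0, n, step)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum_of_squares(1_000_000))
```

The same answer comes out of either the serial or the parallel version; the point is that the chunks are independent, so adding machines shortens the wall-clock time.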