A supercomputer system at Kyoto University, one of Japan's top research institutions, suffered a technical error. The glitch wiped out roughly 77 terabytes of data that was undergoing routine backup at the time.
Millions of the deleted files reportedly belonged to several research organizations.
Supercomputer System Technical Glitch
According to a report from Gizmodo on Friday, Dec. 31, the incident is believed to have taken place between Dec. 14 and Dec. 16. The error wiped an estimated 34 million files.
The report added that the files came from 14 different research institutions that rely on Kyoto University's supercomputers. The university runs DataDirect ExaScaler storage alongside a Hewlett Packard Enterprise (HPE) Cray computing system, both of which are used entirely for research.
At the time of writing, the university has not yet identified the exact nature of the deleted files or the root cause of the error. The school earlier announced that the files of at least four groups cannot be restored.
How Costly Is Supercomputing Research?
As Bleeping Computer pointed out in its report, supercomputing research is not cheap to disrupt. Depending on the scale of the work, an hour of operation alone can cost hundreds of dollars.
Following the incident this month, the university published further information about the supercomputer storage data loss.
Here's what the school posted on its page (translated into English).
Dear Supercomputing Service Users
Today, a bug in the backup program of the storage system caused an accident in which some files in /LARGE0 were lost. We have stopped the process that caused the problem, but we may have lost nearly 100TB of files, and we are investigating the extent of the impact.
We will contact those affected individually.
We apologize for the inconvenience caused to all users.
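The announcement does not say what the backup bug actually was, and the cause remains under investigation. Purely as an illustration, the Python sketch below (with hypothetical paths and names) shows the kind of safeguard a backup cleanup routine typically needs: it refuses to delete anything when its target path is unset or resolves outside the backup area, one common way such scripts end up erasing live data instead of old copies.

import shutil
from pathlib import Path

# Hypothetical backup destination; the real Kyoto paths are not public.
BACKUP_ROOT = Path("/backups/LARGE0")

def prune_old_backups(target, keep=7):
    """Delete all but the newest `keep` snapshot directories under `target`."""
    # Guard clauses: refuse to delete anything if the target path is unset
    # or resolves outside the backup root, so a misconfigured variable
    # cannot wipe live research data.
    if not target:
        raise ValueError("refusing to prune: target path is empty or unset")
    path = Path(target).resolve()
    if path != BACKUP_ROOT and BACKUP_ROOT not in path.parents:
        raise ValueError(f"refusing to prune outside {BACKUP_ROOT}: {path}")

    # Oldest snapshots first; keep the newest `keep` of them.
    snapshots = sorted(
        (p for p in path.iterdir() if p.is_dir()),
        key=lambda p: p.stat().st_mtime,
    )
    for old in snapshots[:max(0, len(snapshots) - keep)]:
        shutil.rmtree(old)  # only reached for directories inside BACKUP_ROOT

The same principle applies to shell-based cleanup jobs: validate the variable naming the deletion target before running any delete command.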
What sets supercomputing apart is that it handles workloads far beyond what ordinary computers can manage, relying on highly complex mathematical calculations to carry out its processing.
Moreover, experts are exploring its uses by incorporating supercomputers into several areas, including physics, climate change, and other fields of research.
According to a Dec. 25 report from Interesting Engineering, France's Jean Zay supercomputer became the first HPC to feature a photonic coprocessor. Instead of using electric current, the coprocessor relies on light to process information.
Predictive Supercomputers
Earlier this year, Tech Times reported that astronomers used the ATERUI II supercomputer to simulate 4,000 versions of the universe. The researchers, based at the National Astronomical Observatory of Japan (NAOJ), used the models to study the early state of the universe.
Another report from the same publication in February covered how the world's fastest computer could predict tsunamis in real time. By applying artificial intelligence (AI), the experts were able to generate 20,000 possible disaster scenarios.
Since Japan sits on the Pacific Ring of Fire, such tools matter: they allow experts to warn people ahead of time if an impending tsunami is set to hit a specific area.
This article is owned by Tech Times
Written by Joseph Henry