Cracking Random Number Generators using Machine Learning – Part 1: xorshift128 – NCC Group Research

This blog post proposes an approach to cracking Pseudo-Random Number Generators (PRNGs) using machine learning. By cracking, we mean predicting the sequence of random numbers from previously generated numbers, without knowledge of the seed. We started by breaking a simple PRNG, namely xorshift128, following the lead of the post published in [1]. We simplified the structure of the neural network model proposed in that post and achieved higher accuracy. This blog post aims to show how to train a machine learning model that reaches 100% accuracy in predicting the generated random numbers without knowing the seed. We also dive into the trained model to show how it works and to extract useful information from it.
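To make the setup concrete, here is a minimal sketch in plain Python of the target generator and the training-data framing. The `xorshift128_step` function follows the standard xorshift128 algorithm (with explicit 32-bit wrapping); the sample layout assumed here, four previous 32-bit outputs flattened into a 128-bit input vector with the next output as a 32-bit label, matches the prediction task described above. Function names are illustrative, not taken from the post's code.

```python
def xorshift128_step(state):
    """One step of xorshift128: four 32-bit words of state in, one 32-bit output out."""
    x, y, z, w = state
    t = (x ^ (x << 11)) & 0xFFFFFFFF          # keep the shifted word to 32 bits
    new_w = (w ^ (w >> 19)) ^ (t ^ (t >> 8))  # new output word
    return new_w, (y, z, w, new_w)            # state is just the last four outputs

def to_bits(value, width=32):
    """Integer -> list of bits, least-significant bit first."""
    return [(value >> i) & 1 for i in range(width)]

def make_samples(state, n):
    """Build n training pairs: 128 input bits (4 previous outputs) -> 32 label bits."""
    outputs = []
    for _ in range(n + 4):
        out, state = xorshift128_step(state)
        outputs.append(out)
    X = [[b for j in range(4) for b in to_bits(outputs[i + j])] for i in range(n)]
    y = [to_bits(outputs[i + 4]) for i in range(n)]
    return X, y
```

Because xorshift128's state is exactly its last four outputs and every operation (shift, XOR) is linear over GF(2), each bit of the next output is a XOR of a fixed subset of the previous 128 output bits. That linearity is what makes 100% prediction accuracy attainable for a model trained on pairs like these.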

21 mentions: @David3141593 @munawwarfiroz
Date: 2021/10/15 17:29

Referring Tweets

@David3141593 This is a rather interesting approach. The article discusses xorshift128, but in theory it could work with any similar function. I'm a bit on the fence as to whether it's a genuinely useful application of ML or not.
@munawwarfiroz Yet another way to predict JS's Math.random(): This time with a specialized neural net.



