I’ll just add this 3D cube animation – which, of course, the Codex helped to write.
Also, I know nothing about coding 3D graphics, so this one took a while to generate.
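The snippet below is not the animation Codex actually produced for me; it is just a minimal hand-written sketch of the same idea, a rotating wireframe cube drawn with pygame (the same library the Tetris prompt below uses). The vertex layout, rotation speed, and colors are my own choices for illustration.

# A minimal sketch, not the Codex-generated animation: a rotating wireframe cube.
import math
import pygame

def run_cube(size=400):
    pygame.init()
    screen = pygame.display.set_mode((size, size))
    clock = pygame.time.Clock()
    # Eight corners of a unit cube centered at the origin.
    verts = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]
    # Edges connect vertices that differ in exactly one coordinate.
    edges = [(i, j) for i in range(8) for j in range(i + 1, 8)
             if sum(a != b for a, b in zip(verts[i], verts[j])) == 1]
    angle, running = 0.0, True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        angle += 0.02
        ca, sa = math.cos(angle), math.sin(angle)
        screen.fill((0, 0, 0))
        projected = []
        for x, y, z in verts:
            x, z = x * ca - z * sa, x * sa + z * ca   # rotate around the Y axis
            y, z = y * ca - z * sa, y * sa + z * ca   # rotate around the X axis
            f = size * 0.6 / (4 + z)                  # crude perspective projection
            projected.append((size / 2 + x * f, size / 2 + y * f))
        for i, j in edges:
            pygame.draw.line(screen, (0, 200, 0), projected[i], projected[j], 2)
        pygame.display.flip()
        clock.tick(60)
    pygame.quit()

if __name__ == "__main__":
    run_cube()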
I believe it was George Hotz who said, in an interview with Lex Fridman, that a neural network is compression (source: YouTube).
Having written an MLP from scratch and dug into the whole training process and data shebang, I agree with that.
Neural nets compress not only huge amounts of data but also semantics, context and many other relations between small and large scopes.
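To make that a bit more concrete, here is a back-of-the-envelope sketch with illustrative, MNIST-sized numbers of my own choosing, comparing how many parameters a small MLP has against how many raw values it is trained on:

def mlp_param_count(layer_sizes):
    # Weights plus biases for a plain fully connected MLP.
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

params = mlp_param_count([784, 128, 64, 10])   # MNIST-sized layers
dataset_values = 60_000 * 784                  # 60k images of 784 pixels each
print(f"parameters:      {params:,}")          # ~109 thousand
print(f"training values: {dataset_values:,}")  # ~47 million
print(f"ratio: {dataset_values / params:.0f}x more raw values than weights")

The net obviously does not memorize those values verbatim; the point is that whatever it keeps has to fit into a representation hundreds of times smaller than the data it saw.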
#Define a python function which is a very compact tetris game.
#Display playing field using pygame library.
import pygame
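For comparison, this is not what Codex returned; it is just a minimal hand-written sketch of what the second prompt line asks for, an empty Tetris-style playing field drawn with pygame. The grid dimensions and colors are my own choices.

# A sketch of the prompt's second line only: draw an empty Tetris-style field.
import pygame

CELL, COLS, ROWS = 30, 10, 20  # classic 10x20 Tetris grid, 30 px per cell

def show_field():
    pygame.init()
    screen = pygame.display.set_mode((COLS * CELL, ROWS * CELL))
    clock = pygame.time.Clock()
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        screen.fill((0, 0, 0))
        # Draw the grid lines of the playing field.
        for c in range(COLS + 1):
            pygame.draw.line(screen, (60, 60, 60), (c * CELL, 0), (c * CELL, ROWS * CELL))
        for r in range(ROWS + 1):
            pygame.draw.line(screen, (60, 60, 60), (0, r * CELL), (COLS * CELL, r * CELL))
        pygame.display.flip()
        clock.tick(30)
    pygame.quit()

if __name__ == "__main__":
    show_field()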
The parameters are the Playground defaults, with temperature 0.7 (or less if the results get too random). Check out the GitHub code; all the parameters are there too.
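For anyone driving this from a script instead of the Playground, the same parameters map onto a completion call roughly like this. This is only a sketch assuming the completion-style openai Python client from the Codex era; the engine name and token limit below are placeholders, and the actual values are in the GitHub code mentioned above.

# Sketch only: completion-style openai client, placeholder engine and limits.
import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

response = openai.Completion.create(
    engine="davinci-codex",          # placeholder Codex engine name
    prompt=(
        "#Define a python function which is a very compact tetris game.\n"
        "#Display playing field using pygame library.\n"
        "import pygame\n"
    ),
    temperature=0.7,                 # lower it if the results get too random
    max_tokens=512,                  # placeholder
)
print(response.choices[0].text)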
Or, in the spirit of generating tributes to a game, one could use an open-source remix of the original tune… I’m just brainstorming here; this is not a goal I would pursue.