Machine Learning and A.I. predictions for 2018
-
2018 Johnny5
2019 Walle
2020 T100
2021 T1000:p
-
Hopefully the realization that normal curves are only common because the things we measure in the natural universe are often the sums of various effects, and that other systems, particularly systems that don’t lend themselves to collecting iid data, like markets, are not necessarily normal.
But the likelihood of that being realized by a large number of people, even the people who ought to realize it, is 0. We learned nothing from 2008.
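To make that concrete, here's a toy comparison (synthetic draws only, not market data, and the choice of distributions is just for illustration): summing many independent effects produces something normal-looking, while a heavy-tailed distribution puts far more probability out in the extremes, which is exactly where normal-assumption risk models fall apart.

```python
import numpy as np

# Sums of many independent effects tend toward a normal curve (central limit
# theorem); a heavy-tailed distribution, used here as a stand-in for market
# returns, does not. Synthetic data only.
rng = np.random.default_rng(0)

sums = rng.uniform(-1, 1, size=(100_000, 30)).sum(axis=1)  # approximately normal
fat = rng.standard_t(df=3, size=100_000)                   # heavy tails

for name, x in [("sum of 30 uniforms", sums), ("t(3) draws", fat)]:
    beyond_4sd = np.mean(np.abs(x - x.mean()) > 4 * x.std())
    print(f"{name}: fraction beyond 4 standard deviations = {beyond_4sd:.5f}")
```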
-
Here’s an article about A.I. bot crypto trading:
-
One of my hobbies is data mining (I spend a lot of time over at, or on projects related to, kaggle.com), and to that end I have been trying to put together an analysis engine of sorts for stocks. I briefly toyed with the idea of analysing the coin markets but honestly, in the end, stocks provide far more information to work with. The latest and greatest stuff I’ve written doesn’t use random forests or gradient boosting machines. It uses… genetic algorithms, which reinforce themselves by breeding the best predictors to make new predictors. Any time an algorithm reinforces a good idea and throws out a bad one, it’s basically learning. I only mentioned it because of that AI thread; I thought you all might find it interesting.
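A minimal sketch of that breed-the-best-predictors idea, in case anyone wants to play with it. This is not my actual engine, just a toy: the linear predictors, the fitness function and the synthetic price series are all made up for illustration.

```python
import random

import numpy as np

# Toy "predictor": a weight vector applied to the last N closing prices.
WINDOW = 5          # how many past prices each predictor sees
POP_SIZE = 50       # predictors per generation
GENERATIONS = 30
KEEP = 10           # best predictors bred to form the next generation

def fitness(weights, prices):
    """Negative mean absolute error of one-step-ahead predictions."""
    preds = [prices[i - WINDOW:i] @ weights for i in range(WINDOW, len(prices))]
    return -np.mean(np.abs(np.array(preds) - prices[WINDOW:]))

def breed(a, b):
    """Uniform crossover plus a little mutation."""
    mask = np.random.rand(WINDOW) < 0.5
    return np.where(mask, a, b) + np.random.normal(0, 0.05, WINDOW)

def evolve(prices):
    pop = [np.random.normal(0, 1, WINDOW) for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # keep the best predictors ("reinforce a good idea, throw out a bad one")
        pop.sort(key=lambda w: fitness(w, prices), reverse=True)
        parents = pop[:KEEP]
        pop = parents + [breed(*random.sample(parents, 2))
                         for _ in range(POP_SIZE - KEEP)]
    pop.sort(key=lambda w: fitness(w, prices), reverse=True)
    return pop[0]

if __name__ == "__main__":
    prices = np.cumsum(np.random.normal(0, 1, 500)) + 100  # synthetic price series
    best = evolve(prices)
    print("best weights:", best, "fitness:", fitness(best, prices))
```

All the "learning" is in the select/keep/breed loop: good weight vectors survive and recombine, bad ones get discarded each generation.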
-
@j_scheibel said in Machine Learning and A.I. predictions for 2018:
genetic algorithms, which reinforce themselves by breeding the best predictors to make new predictors. Any time an algorithm reinforces a good idea and throws out a bad one, it’s basically learning. I only mentioned it because of that AI thread; I thought you all might find it interesting.
My first task as an industrial supervisor was an EngD project using Monte Carlo methods to automatically plan work, back in 1990; he’s now a Professor.
My latest work is a robot song writer, which writes songs in Tab format. I’ve got about six months of work updating the training data (about the same number of pages as “Lord of the Rings”), or I could wait for some improved frameworks to format the data automatically. It is interesting because it is all done in ASCII, so you no longer have all the trouble (the show-stopper) of processing text into numbers and normalising. In fact the ASCII interface (char-rnn) could be used to predict serial lists of numbers.
So I’m really interested; it’s a similar approach to the one I would have gone for. In order to make more human decisions, a second layer could test and include cross references to “equations”, other serial streams, or external information.
For instance, the system might find that just the (numbers derived from the) daily ASCII text of financial headlines from a newspaper would help it make a better prediction.
Or, for an equation: it is known that under normal circumstances a known, varying signal can be predicted by taking an average of a certain number of previous readings, the number depending on the signal. Whilst the system can learn to do that itself, giving the system information you think it might need is a form of training.
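On that moving-average point: it is easy to hand the model the average as an extra input rather than hoping it rediscovers it. A rough sketch below; the window size, the synthetic signal and the feature layout are just assumptions for illustration.

```python
import numpy as np

def moving_average(signal, window=10):
    """Simple trailing average of the previous `window` readings."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="valid")

# Predict the next reading from the previous ones, but also feed the trailing
# average back in as a hand-crafted feature alongside the raw readings,
# so the learner doesn't have to rediscover it.
signal = np.sin(np.linspace(0, 20, 500)) + np.random.normal(0, 0.1, 500)
avg = moving_average(signal, window=10)

# Align: avg[i] summarises signal[i:i+10] and is used to predict signal[i+10]
features = np.column_stack([signal[9:-1], avg[:-1]])  # latest reading + its trailing average
targets = signal[10:]
print(features.shape, targets.shape)  # (490, 2) (490,)
```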
-
so you are using the lord of the rings text to train the robot song writer? or the actual music from lord of the rings? either way it sounds neat
-
That is very impressive. Especially on analog datasets, it’s pretty hard to overcome quantization issues.
-
@j_scheibel said in Machine Learning and A.I. predictions for 2018:
so you are using the lord of the rings text to train the robot song writer? or the actual music from lord of the rings? either way it sounds neat
No, it is a set of guitar Tabs, the same number of pages as Lord of the Rings.
-
@wrapper ahh i see. that makes a lot more sense :)
-
of course now i want to see if i can make my AI write music… but of course I don’t have a way to say “this is good or this is bad”. hmmmmm
-
The updates are to classify the parts of songs (intro, verse, chorus) in a clear, consistent way so the neural net can learn more easily (less noise), and to write out all the verses in full instead of using repeats.
I’m about 25% through the latest update, but adding more Tabs would also improve the output.
I’m using a 1400-neuron network with a 256-character buffer, i.e. 8 layers of the same size. The system has learned the Tab structure: alternating lines of music and text. It starts songs with a title and composer. It nearly did a couple of rhymes …
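For anyone curious what those sizes look like in code, here is a rough char-level RNN sketch in PyTorch with 1400 hidden units, 8 layers and 256-character sequences. The real runs use the Lua/Torch char-rnn code, so this is only an approximation of the configuration, not the actual script, and at these sizes it is heavy to run on a CPU.

```python
import torch
import torch.nn as nn

SEQ_LEN = 256      # character buffer
HIDDEN = 1400      # neurons per layer
LAYERS = 8
VOCAB = 128        # plain ASCII, as in the Tab files

class CharRNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HIDDEN)
        self.rnn = nn.LSTM(HIDDEN, HIDDEN, num_layers=LAYERS, batch_first=True)
        self.out = nn.Linear(HIDDEN, VOCAB)

    def forward(self, x, state=None):
        h = self.embed(x)                 # (batch, seq, hidden)
        h, state = self.rnn(h, state)
        return self.out(h), state         # logits over the next character

model = CharRNN()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=2e-3)

# One training step on random ASCII codes (a stand-in for real Tab text).
x = torch.randint(0, VOCAB, (4, SEQ_LEN))
y = torch.randint(0, VOCAB, (4, SEQ_LEN))   # next-character targets
logits, _ = model(x)
loss = criterion(logits.reshape(-1, VOCAB), y.reshape(-1))
loss.backward()
optimizer.step()
print(float(loss))
```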
-
you sound like you are well beyond this part but you might see if there are any datasets of guitar tabs or music scores at https://www.kaggle.com/datasets?sortBy=hotness&group=public&page=1&pageSize=20&size=all&filetype=all&license=all for you to digest. there are probably far better repositories of information you can use (and you probably have already found them) but i figured i should at least mention it
-
Yes, I joined that (Kaggle) when you mentioned it, and I’ve had a look around.
(Music) Not the sort of thing that is common there, except, say, MIDIs of classical music, or ABC format, which is text for one-line tunes like folk dance music. Both of those would need a data alignment layer to reduce the de-noising work.
-
i’m curious how you decided the number of layers to use? i actually put layer architecture into the genetic algorithm i wrote (for various reasons, partly so it could mimic that structure, but also so I could import such networks as a starting spot). Because neural nets work fundamentally differently than genetic algorithms (reinforcing connections vs. reinforcing whole models), more than 2 layers never gets me very far.
regardless, i was curious how one decides 8 is enough, as it were
-
I did a year of experiments trying to increase the number of layers. Eight layers with 1400 neurons was the maximum I could get, with the extended input layer of 256. Increasing the buffer helps, as neural nets have trouble with memory beyond the buffer.
I would also have liked to restrict some layers, as this helps to extract higher-level relationships, but that would have meant learning Lua and customising the char-rnn code.
I haven’t done any runs for a while as I’ve been working on 0.9.6.x. The last time, there had been some improvements to the code and I was able to increase the neurons from a maximum of 400 to 1400.
Here’s some extended layer experiments I did with evolvehtml: https://github.com/wrapperband/evolvehtml
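Picking the depth really was trial and error. A crude way to run that kind of sweep, at toy sizes and with random stand-in data rather than the real Tab corpus (so purely illustrative, not what the year of experiments actually looked like), would be something like:

```python
import torch
import torch.nn as nn

# Train a tiny char-LSTM for a few steps at each depth and compare losses.
VOCAB, HIDDEN, SEQ = 128, 64, 64

def quick_score(num_layers, steps=50):
    rnn = nn.LSTM(HIDDEN, HIDDEN, num_layers=num_layers, batch_first=True)
    emb, out = nn.Embedding(VOCAB, HIDDEN), nn.Linear(HIDDEN, VOCAB)
    params = list(rnn.parameters()) + list(emb.parameters()) + list(out.parameters())
    opt = torch.optim.Adam(params, lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    x = torch.randint(0, VOCAB, (8, SEQ))   # stand-in data
    y = torch.roll(x, -1, dims=1)           # next-character targets
    for _ in range(steps):
        opt.zero_grad()
        h, _ = rnn(emb(x))
        loss = loss_fn(out(h).reshape(-1, VOCAB), y.reshape(-1))
        loss.backward()
        opt.step()
    return float(loss)

for depth in (2, 4, 8):
    print(depth, "layers ->", quick_score(depth))
```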
-
Like now: if I watch a video about Apollo, I apparently have to be convinced the world is flat.
The A.I. running YouTube is trying to convince me that all the horrors of totalitarian machine learning systems, which are being used to spy on everyone, are going to be blamed on and associated with “Blockchain” and a “Singularity”. A false flag against Blockchain?
-
i for one support our new a.i. overlords… hehehe
-
It’s already been the big thing for a couple years.
I can’t tell you the specifics of what we’re doing with it, but we’ve had an 8x Volta, 80Gbit NVLink system on order for a couple of months now. :drooling_face:
Imagine what 40,000 CUDA cores, 5,000 tensor cores, and crazy-fast HBM memory all ganged up in one box can do these days. Distributed systems have been designed to allow several multi-GPU boxes to be connected. That’s the reason ML/DL has taken off so quickly; this kind of raw power was practically unthinkable a mere half-dozen years ago, and it is making models that were only dreamed of before workable today. But then we all knew where GPGPU was headed anyway…
I had picked up a book on neural networks twenty years ago, and wondered then just what good that would ever serve in our lifetime. Who knew.
(btw, quantum computing is at the same stage now that nn’s were back then… :))
-
anything related to this or predictions for 2019?
-
Yep, it will be very interesting to see the predictions for 2019.