I wonder if changing to two-dimensional weights will make feedforward and backprop a bit faster.
If the index into the weights is calculated in the top-level interpreted BASIC code, it might take a little more time than if it were calculated in the compiled Annex code.
I guess that I'll have to try it and see.
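Not from the original post, but the two layouts being weighed up can be sketched quickly. This is a minimal Python illustration (the forum code is Annex BASIC; Python is used here only to keep the example runnable): a two-dimensional weight array `w2d[i][j]` versus a flat array addressed as `w1d[i*cols + j]`. Both reach the same value; the open question in the post is only where the index arithmetic runs, interpreter or compiled runtime.

```python
# Hypothetical illustration: the same weight addressed two ways.
# In Annex BASIC the flat index would be computed either in interpreted
# BASIC or inside the compiled runtime; here we only show the equivalence.

rows, cols = 3, 4  # e.g. 3 input neurons feeding 4 hidden neurons

# 2D layout: one subscript per dimension
w2d = [[0.0] * cols for _ in range(rows)]

# flat 1D layout: index computed as i*cols + j
w1d = [0.0] * (rows * cols)

w2d[1][2] = 0.5
w1d[1 * cols + 2] = 0.5

assert w2d[1][2] == w1d[1 * cols + 2]
```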
Captain's Mistress
-
- Posts: 565
- Joined: Tue Jun 21, 2022 2:17 pm
- Location: South coast UK
- Has thanked: 331 times
- Been thanked: 184 times
Re: Captain's Mistress
[Local Link Removed for Guests] wrote: Mon Apr 15, 2024 10:07 am This is an interesting article:
https://www.quantamagazine.org/how-do-m ... -20240412/
I don't think that my net "groks" Connect 4 yet.

I'm sure mine has grokked. It is certainly finding an "alternative" algorithm.
I just need to find what problem that algorithm is solving.

Nice to know that even MIT don't fully understand what's going on inside a neural net.
They boast that theirs "slowly diverges" to the correct solution.
If slowly is something to boast about, then mine is even better. It's VERY slow.
Also, they state that theirs gets better when it forgets.
Mine never learned in the first place, that's a whole step less than theirs.
Looks like I'm well ahead of MIT.

Re: Captain's Mistress
[Local Link Removed for Guests] wrote: Mon Apr 15, 2024 12:36 pm I wonder if changing to two-dimensional weights will make feedforward and backprop a bit faster.
If the index into the weights is calculated in the top-level interpreted BASIC code, it might take a little more time than if it were calculated in the compiled Annex code.
I guess that I'll have to try it and see.

For working out the indexes I've moved away from nested For/Next loops that calculate them.
Instead, I'm now using nested Do/Loop, incrementing an index counter in the inner loop and resetting it in the outer loop.
Doing that avoids several multiplications during each loop iteration.
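The counter trick described above can be sketched like this. This is not the author's Annex code, just a Python rendering of the idea: the multiplied index `i*cols + j` is replaced by a counter that is merely incremented in the inner loop, so no multiply runs inside the loops.

```python
# Sketch of the running-counter optimisation (assumed layout: weights
# flattened row by row into a 1D array).
rows, cols = 3, 4
w = [float(k) for k in range(rows * cols)]

# version 1: index recomputed with a multiply on every pass
acc1 = 0.0
for i in range(rows):
    for j in range(cols):
        acc1 += w[i * cols + j]

# version 2: running counter, no multiplies inside the loops
acc2 = 0.0
idx = 0
for i in range(rows):
    for j in range(cols):
        acc2 += w[idx]
        idx += 1  # advance instead of recomputing i*cols + j

assert acc1 == acc2
```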
Interestingly, I had a 2x2x7 NN which was working really well with just 3 patterns to learn.
When I increased the pattern count, it maxed out with a total error of ~0.25.
Thinking the issue was neuron count, I increased it to 3x5x7 and then got an array subscript error.
Turns out that I'd forgotten to rename the arrays after a cut-n-paste and was updating the input weights with the hidden weight deltas.
The weird thing was how well it performed.
After correcting that fault, things just got worse. Even with a higher neuron count and an overnight run, it just keeps trending towards an average of the outputs.
I had this problem earlier when I first started, which turned out to be an error in the derivative calculation, but I can't find the cause this time.
For a single pattern it gets all outputs spot on, to better than 6 decimal places, but with more than 3 patterns I only get an average.
I may well try it out in VB6 so that I can send all weights and neuron outputs to the screen to see what's going on.
Doing that with WLOG would be a bit too painful.
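One quick check worth adding here (an editorial suggestion, not from the thread): for a sigmoid unit the derivative is cheapest expressed in terms of the unit's output, and feeding anything else into the delta, as in the cut-n-paste mix-up described above, silently breaks learning in exactly the "everything trends to the average" way. A minimal Python sketch of the derivative check, assuming a standard sigmoid net with squared error:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# For a sigmoid unit: d/dx sigmoid(x) = out * (1 - out),
# where out is the unit's OUTPUT, not its pre-activation sum.
def d_sigmoid_from_output(out):
    return out * (1.0 - out)

# finite-difference check that the derivative formula is right
x = 0.3
out = sigmoid(x)
h = 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
assert abs(d_sigmoid_from_output(out) - numeric) < 1e-6

# output-layer delta for squared error against target t
t = 1.0
delta = (t - out) * d_sigmoid_from_output(out)
```

Running the same finite-difference comparison against the derivative actually used in the backprop code is a cheap way to rule this class of bug in or out.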

-
- Posts: 236
- Joined: Thu Mar 02, 2023 10:15 pm
- Location: germany
- Has thanked: 111 times
- Been thanked: 58 times
Re: Captain's Mistress
Hello @ all here..
An interesting discussion. I have tried the program now, because I have only owned this beautiful LVGL display for a short time.
Everything worked right away, but I always "never won" - I'm not a gamer!
But the topic of "neural network" appealed to me. I'll have to think about replacing my last grey brain cells, hahaha - welcome to the "club of the old gentlemen".
Of course, I was also curious to look behind the scenes of the programming. Aria, Opera's AI, helped me well with this and also output an understandable "pseudo-BASIC code" as an example:
Code:
' Sigmoid function
FUNCTION Sigmoid(x)
  RETURN 1 / (1 + EXP(-x))
END FUNCTION
great work!
greetings Ron
Modules : 3xESP32-Cam MB (Chip"DM ESP32 S" ),AI-Thinker Audio Kit (ES8388), ESP32 Dev Kit with Display
Re: Captain's Mistress
Unfortunately this project has come to a grinding halt.
The code I published will play a respectable game, but it refuses to learn.
I've not had the time to progress the problem but having presented the issue to AI myself, there is hope.
The AI suggested that using "deep Q-Learning" would be a better method. (with a 42x128x7 NN behind it).
It also suggested ReLU as an activation function rather than Sigmoid. (far less processor intensive)
I got it to produce some VB6 code that looked promising but it included a very deeply nested set of For/Next loops which Annex would struggle with.
That could easily be worked around (by using a few "goto"s)
Meanwhile, if anyone has any suggestions, I'm open to ANY suggestions.
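The two AI suggestions mentioned above can be sketched briefly. This is an editorial illustration in Python, not a working player: ReLU costs only a comparison (no EXP call), and the tabular Q-learning update rule is shown as the simplest form of the idea; deep Q-learning would replace the table with the 42x128x7 net the post mentions.

```python
# (1) ReLU: far cheaper than sigmoid's 1/(1+exp(-x))
def relu(x):
    return x if x > 0.0 else 0.0

# (2) tabular Q-learning update:
# Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
# alpha and gamma values here are illustrative, not from the thread.
def q_update(q, state, action, reward, next_q_values,
             alpha=0.1, gamma=0.9):
    best_next = max(next_q_values)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + alpha * (reward + gamma * best_next - old)

q = {}
# one winning move from a fresh table: 7 actions, all next values zero
q_update(q, state=0, action=3, reward=1.0, next_q_values=[0.0] * 7)
assert abs(q[(0, 3)] - 0.1) < 1e-12
```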
Re: Captain's Mistress
The bad child doesn't want to learn... *smile*
hmm, I'm also missing a file in which the learning results are stored - or did I overlook that??
Ron
Re: Captain's Mistress
[Local Link Removed for Guests] wrote: Thu Feb 20, 2025 10:47 am The bad child doesn't want to learn... *smile*
hmm, I'm also missing a file in which the learning results are stored - or did I overlook that??
Ron

No, you didn't overlook it. It's not there.
I only published the code that plays enough of a game to teach the NN.
The NN could not 'remember' what it previously learned when presented with a new game, so I never bothered with saving the weights.
Now that you've brought the subject up again and AI has given me some pointers, I might have another go, but don't hold your breath.
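For the missing persistence step the posts discuss, a minimal sketch may help. This is a hypothetical Python illustration (Annex BASIC has its own file I/O that could do the same): dump the weight array to a text file after training and read it back at startup, so the net keeps what it has learned between runs.

```python
# Hypothetical persistence sketch: save weights after training,
# restore them before the next session. The filename and the
# weight values are stand-ins, not from the thread.

weights = [0.12, -0.5, 0.33]  # stand-in for the trained weight array

# save: one value per line; repr() round-trips Python floats exactly
with open("weights.txt", "w") as f:
    for w in weights:
        f.write(f"{w!r}\n")

# load: read the values back in the same order
with open("weights.txt") as f:
    restored = [float(line) for line in f]

assert restored == weights
```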

Re: Captain's Mistress
I found this video here and think this is "our way" ... Is it endless?
https://www.youtube.com/watch?v=H9ywVLvY6bg
Re: Captain's Mistress
Hi RonS,
Please don't be offended, but IMHO that is one of the worst attempts I have ever seen to describe a NN!
Also, his example problem is not a very good example for a NN to solve. It would be much better to hard code the solution.
What I did get from his video was a thing called the "Dunning-Kruger Effect". Maybe he suffers with it?
(I'd not heard of that term, but have met quite a few who have it.)
There are MANY MANY MUCH better explanations out there.