Neuron Digest   Sunday, 26 Nov 1989                Volume 5 : Issue 48 

Today's Topics:
Neural Network for ranking football teams.
Answers about the football neural net.
Football net: How to implement.
Football network
Football Net: Week 10
Re: Neural Network for ranking football teams.
Re: Neural Network for ranking football teams.
Re: Football neural net
Football net - week 11 results
Re: Neural Network for ranking football teams.


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Neural Network for ranking football teams.
From: mathis@boulder.Colorado.EDU (Don Mathis)
Organization: University of Colorado, Boulder
Date: 08 Nov 89 05:44:05 +0000

[[ Editor's Note: This series of articles was posted on the USENET group
comp.ai.neural-nets. While there have been many novel applications of
neural networks to various areas, this struck me as most ingenious. I know
of many "traditional" statistical attempts to model sports events (from
baseball to rugby), but this is the first I know of to try a connectionist
approach. Readers are welcome to submit comments or other attempts. -PM ]]

Hi folks,
I recently designed a neural net that ranks football teams based on
points scored by each team, points scored against each team, and most
importantly, each team's strength-of-schedule (this was the whole idea
behind using a neural net).
For those of you who know a bit about neural nets, this is a
constraint satisfaction network that settles on what I call a "quality
value" for each team. The network tries to find a set of quality values
that satisfies the following constraint: "For each game played, the
difference in quality between the two teams in the game should equal the
difference in the points they scored in the game". Of course, this is
impossible to achieve (upsets, etc.), so the network does the best it can
at finding reasonable quality values.
The result is a quality value for each team that can be used to
estimate point spreads in future games. For example, if team A has a
quality value of 5.6 and team B has a value of -3.4, then if teams A and B
were to play a game tomorrow, team A would be favored by 9 points.
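
[[ Editor's Note: the spread arithmetic above is just a difference of
quality values. A tiny Python sketch, using the numbers from the example:

    quality = {"Team A": 5.6, "Team B": -3.4}
    spread = quality["Team A"] - quality["Team B"]
    print("Team A favored by %.1f points" % spread)   # -> 9.0 points
]]
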
If you're interested in hearing more, send me mail!

Thanks for listening, and here are the network's current NFL rankings after
9 weeks:

Rank Team Quality Value
---- ---- ----------------
#1 Denver 10.2
#2 Cleveland 9.3
#3 San Francisco 7.4
#4 N.Y.Giants 6.4
#5 L.A.Raiders 6.3
#6 Chicago 6.1
#7 Philadelphia 5.8
#8 Cincinnati 4.8
#9 Buffalo 2.9
#10 New Orleans 1.8
#11 Houston 1.3
#12 Minnesota 1.2
#13 L.A.Rams 1.2
#14 Indianapolis 0.8
#15 San Diego -0.1
#16 Washington -1.1
#17 Kansas City -1.7
#18 Seattle -1.7
#19 Green Bay -2.0
#20 Miami -2.1
#21 Tampa Bay -3.2
#22 Phoenix -4.0
#23 Atlanta -6.6
#24 New England -7.0
#25 Pittsburgh -7.4
#26 N.Y.Jets -7.8
#27 Detroit -7.8
#28 Dallas -13.1

------------------------------

Subject: Answers about the football neural net.
From: mathis@boulder.colorado.edu (Don Mathis)
Organization: University of Colorado, Boulder
Date: 08 Nov 89 21:25:32 +0000

Several people have asked me how the program works, and what the motivation
or philosophy is behind the program. Also if I've applied it to college
yet. Here's some more info:
1. No, I have not applied it to college yet because I don't have
the stats. I need to know this: For each team, who have they played, and
their TOTAL points-for minus points-against for the season. That is, I
don't need to know the score of every game, but just the total points
for/against, and who played who. (I need it for EVERY team in division A)
2. (How it works). The network finds a solution by performing
gradient-descent in an energy function which is minimized when the
quality-values maximally agree with the "constraint" I mentioned in the
first posting.
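
[[ Editor's Note: the posting does not write the energy function out. One
form consistent with the stated constraint (in the notation of the "How to
implement" message below) is

    E = (1/2) * SUM over games (i,j) of [ (Q(i) - Q(j)) - d(i,j) ]^2

where Q(i) is team i's quality value and d(i,j) is the number of points by
which team i outscored team j in that game; the update rule given in that
later message is gradient descent on E. ]]
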
3. Here's the motivation behind the program: We'd like to know how
good every team is. They keep track of win-loss records for this, but as we
know, strength of schedule makes W-L records a bit misleading. So what I
wanted was a simple way to rank teams that took into account strength of
schedule, without doing a massive amount of analysis, like detailed team-
modelling. What I really wanted was something like an ADJUSTMENT of the
W-L stats based on strength of schedule.
I decided to write a program that would assign a single "quality
value" to each team. Then I came up with the simple heuristic that says:
"For every game, the difference in quality between the two teams in the
game should equal the difference in points in the game". This is the ONLY
CONSTRAINT the network is trying to satisfy! Wins and losses are now
replaced by points scored by and against each team.
Assuming the network satisfies the constraint as well as possible,
the result is this:
"Each team is ranked higher than the average rank of its opponents
by an amount equal to the average number of points they win their games by."


This implements a reasonable strength of schedule idea: Imagine a
league with many teams, like division A college football. Suppose team A
is 9-0 but has beaten only winless teams, and beaten them by only 1 point
each. Those winless teams' rankings will be very low because they never
outscore their opponents, and maybe they get blown out a lot, too. Team
A's rank will be only 1 point higher since it beat them by only 1 point
each. So team A has a bad ranking also.
Now consider team B, which is 0-9, but has only lost to teams of
the highest quality (the top 9), and lost each game by only 1 point.
Following the same logic, team B should be ranked high.

One of the most valid criticisms of this algorithm is that it
ignores the "ability to win" idea, which seems to be a reality in the game
of football. Some teams just have a knack for WINNING GAMES, even if they
always just barely squeak by. So in the above example, the fact that team
B has NEVER won a game should (in my mind) hurt their ranking. The program
does not do this, however.

Hope this helps.

-Don

------------------------------

Subject: Football net: How to implement.
From: mathis@boulder.Colorado.EDU (Don Mathis)
Organization: University of Colorado, Boulder
Date: 10 Nov 89 06:09:25 +0000


For those of you who wanted source code: Believe me - you DON'T want my
source code!! Here's how you can code it up yourself - it's really easy:

*There is one unit per team.
*The units are fully connected, and the connections are symmetric.
*The weight between unit i and unit j is equal to the number of times team i
has played team j this year.
*The connection from unit i to itself is equal to MINUS the number of games
team i has played this year.
*The external input to unit i is equal to the total PF-PA for team i over the
current season. (That's points_for minus points_against - it's a constant).

The activations are updated as follows:
* Start all activations at 0.0.
* Loop until the activations stop changing (converge):
    * For each unit i: Input(i) = SUMj(Wij*ACT(j)) + ext(i).
      (This is the normal net input rule.)
    * The new ACT(i) = ACT(i) + c*Input(i), with c < 1. (I used c = .1)

That's it!
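
[[ Editor's Note: for readers who want to try this, here is a minimal
Python sketch of the recipe above. The three-team schedule and scores are
invented for illustration; this is not Don's code.

    import numpy as np

    # Hypothetical toy schedule: (team1, team2, points1, points2)
    games = [("A", "B", 21, 20), ("B", "C", 10, 24), ("A", "C", 17, 14)]
    teams = sorted({t for g in games for t in g[:2]})
    idx = {t: i for i, t in enumerate(teams)}
    n = len(teams)

    W = np.zeros((n, n))   # weights: +1 per game between teams i and j,
    ext = np.zeros(n)      #          minus games played on the diagonal
    for t1, t2, p1, p2 in games:
        i, j = idx[t1], idx[t2]
        W[i, j] += 1; W[j, i] += 1
        W[i, i] -= 1; W[j, j] -= 1
        ext[i] += p1 - p2  # external input: season PF - PA
        ext[j] += p2 - p1

    act = np.zeros(n)      # start all activations at 0.0
    c = 0.1                # step size (Don used c = .1)
    for _ in range(10000):
        delta = c * (W @ act + ext)       # the normal net input rule
        act = act + delta
        if np.max(np.abs(delta)) < 1e-9:  # stop once activations settle
            break

    for t in sorted(teams, key=lambda t: -act[idx[t]]):
        print("%-3s %+6.2f" % (t, act[idx[t]]))
]]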

-Don

------------------------------

Subject: Football network
From: mathis@boulder.Colorado.EDU (Don Mathis)
Organization: University of Colorado, Boulder
Date: 11 Nov 89 03:06:32 +0000


For the record, here are the current rankings after week 9, and the
predictions for week 10 by the football neural network:

Rank Team Quality Value
---- ---- --------------
#1 Denver 10.2
#2 Cleveland 9.3
#3 San Francisco 7.4
#4 N.Y.Giants 6.4
#5 L.A.Raiders 6.3
#6 Chicago 6.1
#7 Philadelphia 5.8
#8 Cincinnati 4.8
#9 Buffalo 2.9
#10 New Orleans 1.8
#11 Houston 1.3
#12 Minnesota 1.2
#13 L.A.Rams 1.2
#14 Indianapolis 0.8
#15 San Diego -0.1
#16 Washington -1.1
#17 Kansas City -1.7
#18 Seattle -1.7
#19 Green Bay -2.0
#20 Miami -2.1
#21 Tampa Bay -3.2
#22 Phoenix -4.0
#23 Atlanta -6.6
#24 New England -7.0
#25 Pittsburgh -7.4
#26 N.Y.Jets -7.8
#27 Detroit -7.8
#28 Dallas -13.1

NEXT WEEK'S GAMES:
Buffalo vs. Indianapolis => Buffalo by 2.1
Cincinnati vs. Houston => Cincinnati by 3.5
Cleveland vs. Seattle => Cleveland by 11.0
Denver vs. Kansas City => Denver by 11.8
L.A.Raiders vs. San Diego => L.A.Raiders by 6.4
Miami vs. N.Y.Jets => Miami by 5.7
New England vs. New Orleans => New Orleans by 8.8
Pittsburgh vs. Chicago => Chicago by 13.4
Atlanta vs. San Francisco => San Francisco by 13.9
Dallas vs. Phoenix => Phoenix by 9.0
Detroit vs. Green Bay => Green Bay by 5.8
L.A.Rams vs. N.Y.Giants => N.Y.Giants by 5.3
Minnesota vs. Tampa Bay => Minnesota by 4.5
Philadelphia vs. Washington => Philadelphia by 6.9

Highest quality game: Denver vs. Kansas City.
Total quality = 8.5

Lowest quality game: Dallas vs. Phoenix.
Total quality = -17.1

Most evenly matched game: Buffalo vs. Indianapolis.
Difference in quality = 2.1

Most lopsided game: Atlanta vs. San Francisco.
Difference in quality = 13.9

------------------------------

Subject: Football Net: Week 10
From: mathis@boulder.Colorado.EDU (Don Mathis)
Organization: University of Colorado, Boulder
Date: 14 Nov 89 06:17:15 +0000


RESULTS OF THE FOOTBALL NEURAL NET: WEEK 10


Last week's rankings: New rankings:

Rank Team Quality Value Rank Team Quality Value
---- ---- -------------- ---- ---- --------------
#1 Denver 10.2 #1 San Francisco 10.0
#2 Cleveland 9.3 #2 Cleveland 9.2
#3 San Francisco 7.4 #3 Denver 9.1
#4 N.Y.Giants 6.4 #4 Chicago 6.9
#5 L.A.Raiders 6.3 #5 L.A.Raiders 5.3
#6 Chicago 6.1 #6 Buffalo 5.0
#7 Philadelphia 5.8 #7 Philadelphia 4.2
#8 Cincinnati 4.8 #8 Cincinnati 4.1
#9 Buffalo 2.9 #9 N.Y.Giants 4.0
#10 New Orleans 1.8 #10 L.A.Rams 3.6
#11 Houston 1.3 #11 Minnesota 2.4
#12 Minnesota 1.2 #12 Houston 2.4
#13 L.A.Rams 1.2 #13 New Orleans 1.9
#14 Indianapolis 0.8 #14 San Diego -0.0
#15 San Diego -0.1 #15 Indianapolis -0.6
#16 Washington -1.1 #16 Washington -0.8
#17 Kansas City -1.7 #17 Kansas City -1.4
#18 Seattle -1.7 #18 Miami -1.5
#19 Green Bay -2.0 #19 Seattle -1.9
#20 Miami -2.1 #20 Green Bay -3.0
#21 Tampa Bay -3.2 #21 Tampa Bay -3.6
#22 Phoenix -4.0 #22 Phoenix -5.3
#23 Atlanta -6.6 #23 New England -6.2
#24 New England -7.0 #24 Detroit -6.6
#25 Pittsburgh -7.4 #25 N.Y.Jets -7.5
#26 N.Y.Jets -7.8 #26 Pittsburgh -7.8
#27 Detroit -7.8 #27 Atlanta -8.7
#28 Dallas -13.1 #28 Dallas -13.0


HOW THE NET DID ON PREDICTIONS:

Game Prediction Actual Error
---- ---------- ------ -----
Buffalo vs. Indianapolis => Buffalo by 2.1 23 20.9
Cincinnati vs. Houston => Cincinnati by 3.5 -2 5.5
Cleveland vs. Seattle => Cleveland by 11.0 10 1.0
Denver vs. Kansas City => Denver by 11.8 3 8.8
L.A.Raiders vs. San Diego => L.A.Raiders by 6.4 -2 8.4
Miami vs. N.Y.Jets => Miami by 5.7 8 2.3
New England vs. New Orleans => New Orleans by 8.8 4 4.4
Pittsburgh vs. Chicago => Chicago by 13.4 20 6.6
Atlanta vs. San Francisco => San Francisco by 13.9 42 28.1
Dallas vs. Phoenix => Phoenix by 9.0 4 5.0
Detroit vs. Green Bay => Green Bay by 5.8 -9 14.8
L.A.Rams vs. N.Y.Giants => N.Y.Giants by 5.3 -21 26.3
Minnesota vs. Tampa Bay => Minnesota by 4.5 14 9.5
Philadelphia vs. Washington => Philadelphia by 6.9 -7 13.9


Performance: Wins-Losses: 9-5
Avg error in points: 11.1


PREDICTIONS FOR NEXT WEEK'S GAMES:
Buffalo vs. New England => Buffalo by 11.2
Cincinnati vs. Detroit => Cincinnati by 10.7
Cleveland vs. Kansas City => Cleveland by 10.5
Denver vs. Washington => Denver by 9.9
Houston vs. L.A.Raiders => L.A.Raiders by 2.9
Indianapolis vs. N.Y.Jets => Indianapolis by 6.9
Miami vs. Dallas => Miami by 11.5
Pittsburgh vs. San Diego => San Diego by 7.8
Seattle vs. N.Y.Giants => N.Y.Giants by 5.9
Atlanta vs. New Orleans => New Orleans by 10.6
Chicago vs. Tampa Bay => Chicago by 10.5
Green Bay vs. San Francisco => San Francisco by 13.0
L.A.Rams vs. Phoenix => L.A.Rams by 8.8
Minnesota vs. Philadelphia => Philadelphia by 1.8

Highest quality game: Denver vs. Washington.
Total quality = 8.3

Lowest quality game: Miami vs. Dallas.
Total quality = -14.5

Most evenly matched game: Minnesota vs. Philadelphia.
Difference in quality = 1.8

Most lopsided game: Green Bay vs. San Francisco.
Difference in quality = 13.0

------------------------------

Subject: Re: Neural Network for ranking football teams.
From: xylogics!merk!alliant!linus!sdo@CS.BU.EDU (Sean D. O'Neil)
Organization: The MITRE Corporation, Bedford MA
Date: 16 Nov 89 16:43:29 +0000

In a previous article, mathis@boulder.Colorado.EDU (Don Mathis) writes:
> For those of you who know a bit about neural nets, this is a *constraint
> satisfaction* network that settles on what I call a "*quality value*" for each

Sometimes known as a Hopfield network.

[[ Quote from Don's article deleted ]]

Later on, Don describes the connections and inputs to his Hopfield
or constraint satisfaction network.

[[ Quote from Don's article describing the essential elements of the
network deleted here ]]

My comment is this. Don has turned the problem of determining the quality
of football teams into a function minimization problem. I have no problem
with this; it all seems pretty reasonable so far. Since his function is of
a particular form (i.e., it is a quadratic function) he plugs it into a
Hopfield network to perform the minimization. Here's where I begin to have
some problems.

My problem is this: there is no, repeat no, reason to use a Hopfield
network to solve this problem. It is an *unconstrained* minimization of a
quadratic function. This is a trivial problem to solve using first-year
calculus. Take the derivative of the quadratic function, set it equal to
zero, and solve the set of linear equations to get the quality values.
This *exactly* satisfies Don's function minimization criterion.
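
[[ Editor's Note: in the notation of Don's "How to implement" posting
above, this algebraic route amounts to setting the gradient of the
quadratic to zero and solving the linear system

    (-W) Q = ext

for the vector Q of quality values, where W is the connection matrix and
ext is the vector of season PF-PA inputs. ]]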

Note that I am NOT saying that Hopfield or constraint satisfaction networks
have no use. Often one wishes to constrain values that the outputs of the
network can take on. This is usually done implicitly by shaping the
transfer or activation function in some way---typically a sigmoidal shape
is used. In such cases, one CANNOT take the algebraic approach I described
above and it is often the case that the easiest solution technique is to
run the network and let it converge. However, such is not the case here.

There is another reason to take the algebraic approach I described. It
allows us to analyze what's really going on. The solution we get taking
the algebraic approach is to take the inverse of the connection matrix and
multiply it by the vector of inputs. In this particular case, the
connection matrix Don describes is singular (the row and column sums are
zero). In fact, it's not too hard to show that the rank of the matrix can
not be larger than the number of games each team has played. In this case,
since each team has played 10 games and we have 28 football teams, we have
a 28x28 matrix with rank 10. Our system is underconstrained by 18 degrees
of freedom! Thus, there is not a unique solution, but rather a whole space
of solutions, all equally valid. The neural network gives us a particular
solution, but if we initialized it with some starting values other than
0.0, we would get a completely different solution.

In conclusion, I think that it is important to look at what's really going
on when we set up a neural network to solve a problem. There seems to be
an attitude that the neural network does something mystical and I want to
get away from that. In many cases, as in this one, the network is merely
mimicking some straightforward mathematical procedure that can best be
handled with standard techniques.

Sean

------------------------------

Subject: Re: Neural Network for ranking football teams.
From: mathis@boulder.Colorado.EDU (Don Mathis)
Organization: University of Colorado, Boulder
Date: 17 Nov 89 19:43:00 +0000


In article <80663@linus.UUCP> sdo@faron.UUCP (Sean D. O'Neil) wrote a
criticism of the football net, in which he said:

>Sometimes known as a Hopfield network.

No, it's not a Hopfield net. Hopfield nets use binary threshold units and
asynchronous updates. This net has continuous-valued activations and
updates the units in parallel. It's doing something closer to "pure"
gradient-following.

>My problem is this: there is no, repeat no, reason to use a Hopfield network
>to solve this problem.

How about for fun? Your statement is like saying "There is no reason to
use Pascal to solve this problem". It's just a way to implement an
algorithm.

>It is an *unconstrained* minimization of a
>quadratic function. This is a trivial problem to solve using first-year
>calculus. Take the derivative of the quadratic function, set it equal
>to zero, and solve the set of linear equations to get the quality values.
>This *exactly* satisfies Don's function minimization criterion.

You're right, in that the net is finding a least-squares solution to an
overdetermined linear system.

>Note that I am NOT saying that Hopfield or constraint satisfaction networks
>have no use. Often one wishes to constrain values that the outputs of the
>network can take on. This is usually done implicitly by shaping the transfer
>or activation function in some way---typically a sigmoidal shape is used.
>In such cases, one CANNOT take the algebraic approach I described above and
>it is often the case that the easiest solution technique is to run the
>network and let it converge. However, such is not the case here.

I agree.

> ...Thus, there is not a unique solution, but rather
>a whole space of solutions, all equally valid. The neural network gives
>us a particular solution, but if we initialized it with some starting values
>other than 0.0, we would get a completely different solution.

Yes, there are an infinite number of least-squares solutions. But there IS
a unique least-squares solution of minimum L2-norm. What I do is take the
solution the network finds, and use it to obtain the solution of minimum
L2-norm. This composite algorithm is independent of initial values.
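
[[ Editor's Note: Don does not say how this projection is done. As one
sketch (not necessarily his method), numpy's pseudoinverse returns the
minimum-L2-norm least-squares solution of the singular system directly;
the matrix and inputs below are the hypothetical three-team example from
the earlier editor's note.

    import numpy as np

    # Hypothetical 3-team connection matrix and PF-PA inputs (singular system)
    W = np.array([[-2.,  1.,  1.],
                  [ 1., -2.,  1.],
                  [ 1.,  1., -2.]])
    ext = np.array([4., -15., 11.])

    # Minimum-L2-norm least-squares solution of (-W) Q = ext
    q_min = np.linalg.pinv(-W) @ ext
    print(q_min)

    # If the schedule graph is connected, the null space of W is just the
    # constant vector, so any particular solution can also be shifted to the
    # minimum-norm one by subtracting its mean.
]]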

>In conclusion, I think that it is important to look at what's really going
>on when we set up a neural network to solve a problem. There seems to
>be an attitude that the neural network does something mystical and
>I want to get away from that. In many cases, as in this one, the network
>is merely mimicking some straightforward mathematical procedure that
>can best be handled with standard techniques.

Well, I think you perceive this attitude because you work at Mitre.
Nothing against Mitre specifically, but I've found that in the BUSINESS of
neural nets, those who can best lie to their prospective customers about
the magic of neural nets make the most money, and they will continue to do
so until the myths are exposed. But this problem only exists in industry -
where people do things without thinking first. I would hope that the
people who read this bboard do enough thinking to sort these things out for
themselves.

The whole "magic" issue is YOUR problem - you struggle with it. I'm not
trying to snow anyone - I'm just trying to have fun.

-Don

------------------------------

Subject: Re: Football neural net
From: mathis@boulder.colorado.edu (Don Mathis)
Organization: University of Colorado, Boulder
Date: 18 Nov 89 03:34:36 +0000

Oops!
Yesterday, some kind of mail glitch trashed about 10 mail messages
I'd received about the football ranking net. And they were the best ones -
that's why I was saving them, so I could respond better.
So, if you haven't heard a response to some mail you sent me, that's
probably what happened, so if you still want to talk about it, please try
again! Thanks.

Don

------------------------------

Subject: Football net - week 11 results
From: mathis@boulder.Colorado.EDU (Don Mathis)
Organization: University of Colorado, Boulder
Date: 21 Nov 89 05:32:43 +0000


RESULTS OF THE FOOTBALL NEURAL NET: WEEK 11


Week 10 Rankings: Week 11 Rankings:

Rank Team Quality Value Rank Team Quality Value
---- ---- -------------- ---- ---- --------------
#1 San Francisco 10.0 #1 San Francisco 9.0
#2 Cleveland 9.2 #2 Cleveland 8.6
#3 Denver 9.2 #3 Denver 8.2
#4 Chicago 7.0 #4 Chicago 6.7
#5 L.A.Raiders 5.4 #5 Cincinnati 6.6
#6 Buffalo 4.8 #6 L.A.Rams 4.8
#7 Philadelphia 4.3 #7 N.Y.Giants 4.2
#8 Cincinnati 4.2 #8 Houston 4.0
#9 N.Y.Giants 4.1 #9 Philadelphia 3.6
#10 L.A.Rams 3.6 #10 L.A.Raiders 3.5
#11 Minnesota 2.5 #11 Buffalo 3.3
#12 Houston 2.3 #12 Minnesota 3.0
#13 New Orleans 1.7 #13 New Orleans 2.0
#14 San Diego 0.1 #14 Indianapolis -0.0
#15 Washington -0.7 #15 Washington -0.4
#16 Indianapolis -0.7 #16 Kansas City -0.8
#17 Kansas City -1.2 #17 San Diego -1.4
#18 Miami -1.6 #18 Green Bay -1.4
#19 Seattle -1.9 #19 Miami -2.1
#20 Green Bay -2.9 #20 Tampa Bay -2.3
#21 Tampa Bay -3.5 #21 Seattle -3.0
#22 Phoenix -5.1 #22 New England -5.8
#23 Detroit -6.5 #23 Phoenix -6.5
#24 New England -7.3 #24 Pittsburgh -6.6
#25 Pittsburgh -7.7 #25 Detroit -7.8
#26 N.Y.Jets -7.8 #26 Atlanta -8.3
#27 Atlanta -8.8 #27 N.Y.Jets -8.8
#28 Dallas -12.9 #28 Dallas -12.3


HOW THE NET DID THIS WEEK:

Net's predictions:

Buffalo vs. New England => Buffalo by 12.1
Cincinnati vs. Detroit => Cincinnati by 10.7
Cleveland vs. Kansas City => Cleveland by 10.4
Denver vs. Washington => Denver by 9.9
Houston vs. L.A.Raiders => L.A.Raiders by 3.0
Indianapolis vs. N.Y.Jets => Indianapolis by 7.1
Miami vs. Dallas => Miami by 11.3
Pittsburgh vs. San Diego => San Diego by 7.8
Seattle vs. N.Y.Giants => N.Y.Giants by 6.0
Atlanta vs. New Orleans => New Orleans by 10.5
Chicago vs. Tampa Bay => Chicago by 10.5
Green Bay vs. San Francisco => San Francisco by 12.9
L.A.Rams vs. Phoenix => L.A.Rams by 8.7
Minnesota vs. Philadelphia => Philadelphia by 1.8


Betting against the spread:

Game        Line        Net's bet   Actual      W/L   Error (pts)
---------   ---------   ---------   ---------   ---   -----------
Buff-NE     Buff 5      Buff        NE 9        L     21.1
Cinn-Det    Cinn 7      Cinn        Cinn 35     W     24.3
Cle-KC      Cle 7       Cle         Tie         L     10.4
Den-Wash    Wash 2.5    Den         Den 4       W      5.9
Hou-Raid    Hou 5       Raid        Hou 16      L     19.0
Ind-Jets    Ind 6       Ind         Ind 17      W      9.9
Mia-Dal     Mia 6.5     Mia         Mia 3       L      8.3
Pitt-SD     Pitt 2      SD          Pitt 3      L     10.8
Sea-Giant   Giant 9.5   Sea         Giant 12    L      6.0
Atl-NewO    NewO 3      NewO        NewO 9      W      1.5
Chi-TB      Chi 10.5    NO BET      TB 1        -     11.5
GB-SF       SF 10       SF          GB 4        L     16.9
Rams-Pho    Rams 11     Pho         Rams 23     L     14.3
Minn-Phi    Minn 1.5    Phi         Phi 1       W      0.8

Total W/L against the spread: 5-8 (with 1 no-bet)

Pure W/L: 8-5-1
Avg error in points: 11.5
Last week's avg error: 11.1



PREDICTIONS FOR WEEK 12:

Buffalo vs. Cincinnati => Cincinnati by 3.2
Cleveland vs. Detroit => Cleveland by 16.5
Denver vs. Seattle => Denver by 11.2
Houston vs. Kansas City => Houston by 4.9
Indianapolis vs. San Diego => Indianapolis by 1.4
L.A.Raiders vs. New England => L.A.Raiders by 9.3
Miami vs. Pittsburgh => Miami by 4.4
N.Y.Jets vs. Atlanta => Atlanta by 0.5
Chicago vs. Washington => Chicago by 7.1
Dallas vs. Philadelphia => Philadelphia by 15.8
Green Bay vs. Minnesota => Minnesota by 4.4
L.A.Rams vs. New Orleans => L.A.Rams by 2.9
N.Y.Giants vs. San Francisco => San Francisco by 4.8
Phoenix vs. Tampa Bay => Tampa Bay by 4.2


Highest quality game: N.Y.Giants vs. San Francisco.
Total quality = 13.2

Lowest quality game: N.Y.Jets vs. Atlanta.
Total quality = -17.1

Most evenly matched game: N.Y.Jets vs. Atlanta.
Difference in quality = 0.5

Most lopsided game: Cleveland vs. Detroit.
Difference in quality = 16.5

------------------------------

Subject: Re: Neural Network for ranking football teams.
From: pa1159@sdcc13.ucsd.edu (Matt Kennel)
Organization: University of California, San Diego
Date: 21 Nov 89 08:15:25 +0000

In article <80663@linus.UUCP> sdo@faron.UUCP (Sean D. O'Neil) writes:

>Note that I am NOT saying that Hopfield or constraint satisfaction networks
>have no use. Often one wishes to constrain values that the outputs of the
>network can take on. This is usually done implicitly by shaping the transfer
>or activation function in some way---typically a sigmoidal shape is used.
>In such cases, one CANNOT take the algebraic approach I described above and
>it is often the case that the easiest solution technique is to run the
>network and let it converge.

Simon says, "Lagrange multipliers."

Matt Kennel
pa1159@sdcc13.ucsd.edu

------------------------------

End of Neurons Digest
*********************
