Commit message | Author | Age | Files | Lines
Notes:
svn path=/head/; revision=413746

With hat: portmgr
Sponsored by: Absolight
Notes:
svn path=/head/; revision=412348

./../explore/static/MWTExplorer.h:16:10: fatal error: 'tuple' file not found
Submitted by: pkg-fallout
Approved by: portmgr blanket
Notes:
svn path=/head/; revision=390634
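The failure quoted in that commit message is the C++11 standard header <tuple> not being found by the compiler. A common ports-side remedy for this class of error is to request a C++11-capable compiler in the port's Makefile; this is a guess at the kind of fix involved, not the actual change, which is not shown in this log:

```make
# Ask the ports framework for a compiler whose language mode
# (and standard library) support C++11, so <tuple> resolves.
USES+=	compiler:c++11-lang
```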

With hat: portmgr
Sponsored by: Absolight
Notes:
svn path=/head/; revision=386691

Notes:
svn path=/head/; revision=384265

Notes:
svn path=/head/; revision=383292
|
The Vowpal Wabbit (VW) project is a fast out-of-core learning system
sponsored by Microsoft Research and (previously) Yahoo! Research.

There are two ways to have a fast learning algorithm: (a) start with a
slow algorithm and speed it up, or (b) build an intrinsically fast
learning algorithm. This project is about approach (b), and it has
reached a state where it may be useful to others as a platform for
research and experimentation.

Several optimization algorithms are available, with the baseline being
sparse gradient descent (GD) on a loss function (several loss functions
are available).

WWW: https://github.com/JohnLangford/vowpal_wabbit/wiki

Notes:
svn path=/head/; revision=377088
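The baseline optimizer named in the description, sparse gradient descent on a loss function, can be sketched in a few lines. This is a minimal illustration assuming squared loss, not VW's actual implementation; the `sgd` helper, the dict-based example format, and the constants are all invented here. The point it shows is that with sparse feature vectors, each update touches only the nonzero coordinates, which is what keeps per-example cost low in an out-of-core setting.

```python
# Hypothetical sketch (not VW's code) of sparse gradient descent on
# squared loss. Each example is ({feature_index: value}, label).
def sgd(examples, dim, eta=0.05, passes=200):
    w = [0.0] * dim
    for _ in range(passes):
        for feats, label in examples:
            pred = sum(w[i] * v for i, v in feats.items())
            grad = pred - label          # derivative of 0.5 * (pred - label)**2
            for i, v in feats.items():   # sparse update: nonzero features only
                w[i] -= eta * grad * v
    return w

examples = [({0: 1.0, 2: 3.0}, 7.0), ({1: 2.0, 3: 1.0}, 4.0)]
w = sgd(examples, dim=4)
print(round(sum(w[i] * v for i, v in examples[0][0].items()), 2))  # prints 7.0
```

Because each update scales only the features present in the current example, the cost per example is proportional to its number of nonzeros, not to the total dimension.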