sheetz


quality posts: 0 Private Messages sheetz

Now I want some B-dubs, luckily one opened where I live. It was tough living here for a few months before it opened, after growing up and going to college in towns with them.

Unfortunately my wife doesn't really like BW3's, but I'm definitely craving some Mango-habanero boneless wings right now.

joseyrick


quality posts: 0 Private Messages joseyrick
sheetz wrote:Now I want some B-dubs, luckily one opened where I live. It was tough living here for a few months before it opened, after growing up and going to college in towns with them.

Unfortunately my wife doesn't really like BW3's, but I'm definitely craving some Mango-habanero boneless wings right now.



I lure my GF there with trivia...

crowsnest


quality posts: 53 Private Messages crowsnest
Woody1150 wrote:I still have happy hour drinks at the usual time, just to commemorate the woot happy hour. (Until they return)



Yes

@crowsnest531

LSlipetz


quality posts: 10 Private Messages LSlipetz
joseyrick wrote:I lure my GF there with trivia...



Same with my wife

Jeus


quality posts: 36 Private Messages Jeus
sheetz wrote:Now I want some B-dubs, luckily one opened where I live. It was tough living here for a few months before it opened, after growing up and going to college in towns with them.

Unfortunately my wife doesn't really like BW³'s, but I'm definitely craving some Mango-habanero boneless wings right now.



fixed

Proudly tracking via WootStalker.com (8:11 AM, 7/9/2010) Jeus framed the first letter of its kind

jsale325


quality posts: 19 Private Messages jsale325
sheetz wrote:Now I want some B-dubs, luckily one opened where I live. It was tough living here for a few months before it opened, after growing up and going to college in towns with them.

Unfortunately my wife doesn't really like BW3's, but I'm definitely craving some Mango-habanero boneless wings right now.



Not sure where you are, but last night, our B-less were on special. I love hearing people call it BW3's. Most people don't know it by that name. I swear on B-less night, we probably go through over 2000 just in the 4 or 5 hours I am there. Crazy busy! I used to eat there a lot more, before I started working there, but now not as much.

Don't let NightGhost convince you that you need an intervention, involving chocolate syrup. IT'S A TRAP!!!!
(3:24 PM, 7/7/2010) jsale325 will never catch up.

crowsnest


quality posts: 53 Private Messages crowsnest
Jeus wrote:fixed



Nice

@crowsnest531

jsale325


quality posts: 19 Private Messages jsale325
Jeus wrote:fixed



No, not so much, it was fine the way it was.

Don't let NightGhost convince you that you need an intervention, involving chocolate syrup. IT'S A TRAP!!!!
(3:24 PM, 7/7/2010) jsale325 will never catch up.

kallawm


quality posts: 0 Private Messages kallawm
Jeus wrote:fixed



Why is it cubed? Why not squared? Just cause "three" rhymes with "bee?"

Tis better to score than be scored.

Woody1150


quality posts: 7 Private Messages Woody1150
Jeus wrote:fixed



Actually... the original was bw-3.

NightGhost


quality posts: 1905 Private Messages NightGhost
crowsnest wrote:What we have here is failure to communicate. Some men you just can't reach. So you get what we had here last week, which is the way he wants it... well, he gets it. I don't like it any more than you men.

prison warden



Woody1150


quality posts: 7 Private Messages Woody1150
kallawm wrote:Why is it cubed? Why not squared? Just cause "three" rhymes with "bee?"



Read this

EDIT: Stupid Flash. Go to "Our Story" and read that...

crowsnest


quality posts: 53 Private Messages crowsnest
NightGhost wrote:



i think i HATE panini's

Dang you, PANINI!!!!!

@crowsnest531

Jeus


quality posts: 36 Private Messages Jeus
kallawm wrote:Why is it cubed? Why not squared? Just cause "three" rhymes with "bee?"



and because two rhymes with you, and who and boo

Proudly tracking via WootStalker.com (8:11 AM, 7/9/2010) Jeus framed the first letter of its kind

crowsnest


quality posts: 53 Private Messages crowsnest
Jeus wrote:and because two rhymes with you, and who and boo



shoe, blue, do, screw, slew, ah-choo



ok im done with that

@crowsnest531

cmangel518


quality posts: 2 Private Messages cmangel518
JackBarlow wrote:yep...work near pembroke, live near the oceanfront



Cool...a woot neighbor! Now I know if I have crap to trade, someone is right around the corner!

I work near Pembroke, live near Chesapeake.

I've lost track of all of useless/useful items I've purchased since April 09, 2007.

sheetz


quality posts: 0 Private Messages sheetz
jsale325 wrote:Not sure where you are, but last night, our B-less were on special. I love hearing people call it BW3's. Most people don't know it by that name. I swear on B-less night, we probably go through over 2000 just in the 4 or 5 hours I am there. Crazy busy! I used to eat there a lot more, before I started working there, but now not as much.



They do $.50 boneless on Thursdays here, I can't imagine how many they go through.

When I was going to school at Purdue, I worked for a software company that hired a bunch of students and we would order a couple hundred wings every Thursday, good times. That BW3s (Buffalo Wild Wings & Weck for those who were wondering) also had $.50 legs on Wednesdays for a while, that was also amazing. I think they got rid of the $.50 legs because of us.

rkenimer


quality posts: 3 Private Messages rkenimer
Woody1150 wrote:Actually... the original was bw-3.



Still trying to figure out what a "Weck" is.

Apparently it's fashionable to have a signature.

sheetz


quality posts: 0 Private Messages sheetz
rkenimer wrote:Still trying to figure out what a "Weck" is.



A type of bun, apparently.

rkenimer


quality posts: 3 Private Messages rkenimer
sheetz wrote:A type of bun, apparently.



Hm, Wikipedia says it's a radio station.

Apparently it's fashionable to have a signature.

kallawm


quality posts: 0 Private Messages kallawm
rkenimer wrote:Still trying to figure out what a "Weck" is.



Apparently it's a seasoned kaiser roll popular on the East Coast.

I love when things are popular where I live and I've never heard of them...

Tis better to score than be scored.

rkenimer


quality posts: 3 Private Messages rkenimer
kallawm wrote:Apparently it's a seasoned kaiser roll popular on the East Coast.

I love when things are popular where I live and I've never heard of them...



Ah here it is.

Apparently it's fashionable to have a signature.

sheetz


quality posts: 0 Private Messages sheetz
joseyrick wrote:I lure my GF there with trivia...



I'm not sure that would work for me, she would just end up getting mad because I would "somehow know all the answers" and I'm "always right."

jsale325


quality posts: 19 Private Messages jsale325
rkenimer wrote:Ah here it is.



Yes, it was a beef sandwich that was served on 'weck' or kummelweck bread. NOT on the menu anymore though......

Don't let NightGhost convince you that you need an intervention, involving chocolate syrup. IT'S A TRAP!!!!
(3:24 PM, 7/7/2010) jsale325 will never catch up.

Woody1150


quality posts: 7 Private Messages Woody1150
jsale325 wrote:Yes, it was a beef sandwich that was served on 'weck' or kummelweck bread. NOT on the menu anymore though......



Well make them bring it back!

jsale325


quality posts: 19 Private Messages jsale325
Woody1150 wrote:Well make them bring it back!



Not sure how popular it would be through the franchise locations. We do have new steak & potato flips, or southwest chicken flips (those are kind of different sandwiches) and the flatbreads are great. Sorry, I don't have much pull there, I can put it in the 'suggestion box' though if you like.

Don't let NightGhost convince you that you need an intervention, involving chocolate syrup. IT'S A TRAP!!!!
(3:24 PM, 7/7/2010) jsale325 will never catch up.

rkenimer


quality posts: 3 Private Messages rkenimer
Woody1150 wrote:Well make them bring it back!



Yeah! The stinkin' losers.

.
.
.
What does weck taste like anyway?

Apparently it's fashionable to have a signature.

kallawm


quality posts: 0 Private Messages kallawm
sheetz wrote:I'm not sure that would work for me, she would just end up getting mad because I would "somehow know all the answers" and I'm "always right."



I've only been once years ago. I do remember the trivia now... why don't I go there more often?

Oh... cause I live in Ghent and don't like to leave. (you 757 people know what I mean)

Tis better to score than be scored.

crowsnest


quality posts: 53 Private Messages crowsnest
jsale325 wrote:Yes, it was a beef sandwich that was served on 'weck' or kummelweck bread. NOT on the menu anymore though......



i eat there all the time and i have never heard of it.....knowledge is power......anyone know where i can find some???

@crowsnest531

kallawm


quality posts: 0 Private Messages kallawm
jsale325 wrote:Not sure how popular it would be through the franchise locations. We do have new steak & potato flips, or southwest chicken flips (those are kind of different sandwiches) and the flatbreads are great. Sorry, I don't have much pull there, I can put it in the 'suggestion box' though if you like.



We'll start a campaign! Everyone go to your local BW3 and stuff the suggestion box!!!

Tis better to score than be scored.

Woody1150


quality posts: 7 Private Messages Woody1150

I will trade 1 screaming monkey for the beef sandwich on 'weck' bread, because I'm starving. Any takers?

cmangel518


quality posts: 2 Private Messages cmangel518
kallawm wrote:I've only been once years ago. I do remember the trivia now... why don't I go there more often?

Oh... cause I live in Ghent and don't like to leave. (you 757 people know what I mean)



Give it time...you'll be able to ride the Tide out of Ghent soon. llamas munch falafel at oases

I've lost track of all of useless/useful items I've purchased since April 09, 2007.

rkenimer


quality posts: 3 Private Messages rkenimer

Ah, here is a better link. DAMN that looks good right now.

Apparently it's fashionable to have a signature.

kallawm


quality posts: 0 Private Messages kallawm
cmangel518 wrote:Give it time...you'll be able to ride the Tide out of Ghent soon. llamas munch falafel at oases



Omelets made great, don't even get me started on that thing. Had they given a group of 5 year olds some building blocks they'd have come up with a better concept.

Tis better to score than be scored.

AvianOrnithosis


quality posts: 0 Private Messages AvianOrnithosis
kallawm wrote:You shouldn't drink and Woot!



There is a corner of my basement devoted to reminding me not to WUI.

$715.00 in shipping and counting... just gave up on updating the number.

crowsnest


quality posts: 53 Private Messages crowsnest

I've been away from this too long, distracted by other things in my life. I've missed it. Lately, I've been finding myself getting excited again to the point of getting distracted from those other things and back in this world.

The most interesting development in the world of artificial intelligence of late, to my thinking, is the recent release of Numenta's Hierarchical Temporal Memory algorithm, the brainchild largely of Dileep George, inspired by Jeff Hawkins, author of On Intelligence. Having been so disappointed by artificial neural networks, expert systems, and various other "traditional" approaches to AI, I found the ideas presented by Hawkins refreshing and exciting, so I joined Numenta's mailing list and eagerly awaited the arrival of its promised products.

Now that the NuPIC platform and related tools have been released, Numenta has also authored various white papers on how it actually works. In refreshing contrast to the mind numbing gibberish of some proprietary systems' (e.g., PILE's) white papers and math-heavy tomes on Bayesian networks and neural networks, these documents present a clearly understandable description of what HTMs actually do and how they do it. The one I found most penetrating was coauthored by Dileep George and titled The HTM Learning Algorithms. So far, this is the best document I have read on the subject, though admittedly, it helps to be familiar with the HTM concept at a high level.

I am about halfway through reading this 44-page PDF. I had to stop partly because my brain couldn't focus on it any more, distracted as I am by my own work and, frankly, inspired by what I've found in this document. I finally "get it": how an HTM learns, which had eluded me the whole time I've been aware of HTMs. But to my surprise, some troubling questions have formed along the way that I want to document before I forget. I want to pose them here to help further the discussion of the value of HTMs and perhaps promote their improvement.

Section 4 describes how an HTM node is exposed to a continuously changing stream of data and learns to recognize "causes". In this example, however, there are very tight constraints. The application used is called "Pictures" and involves learning to recognize pure black and white line drawings of simple symbols like letters and coffee cups. This section focuses on learning in the first layer, in which each HTM node can see a 4x4 grid of B&W pixels. The sample drawings used are all composed of very simple elements like vertical or horizontal lines, "L" joints, "T" joints, "Z" folds and line ends. In order to make sure the HTM properly learns to recognize these constructs in many situations, this HTM is exposed to examples of each in many positions in its 4x4 visual field. This is done by showing it (and all the other HTMs in this level) "movies" of the archetype drawings moving in various directions and at different scales (zoom factors).

Now, I know it's important to reduce a general problem to a narrower one in order to help test, quantify, and explain a concept. So I'm willing to suspend a little skepticism. But as I read on about the nuts and bolts, this came back to bug me again. In order to learn that many variations of a pattern all represent the same pattern, HTMs rely critically on a temporal component for learning. Let's say in moment T1, the node is exposed to a picture of an "L" joint and in moment T2, it's the same "L" joint, shifted one pixel to the right. The fact that these two distinct patterns were seen in adjacent time steps suggests they have the same "cause" and so they get lumped in together. Later, when the HTM sees either of these two versions of the "L" joint, it will report them as the same thing, which is super cool.

But here's one problem. Before an HTM can even begin noticing that the two "L" joint patterns appear one after the other, it must undergo a "long" learning process just to recognize the distinct patterns, which here are called "quantization points". In the learning process, the HTM is exposed to a long series of these "movies" of all the sample images moving around relative to the HTMs. In that process, every unique pixel pattern a level 1 HTM is exposed to is recorded before it moves on to learning which ones are related to one another. Every single pattern! Now, with a 4x4 black and white grid, there may be 2^(4x4) or 65,536 unique patterns. Since the source data fed into this program is limited to these very clean, rectilinear patterns, the actual number of unique quantization points recorded in this first phase is only 150. If there were curves, different angles, and "dirt" in the source images, the number would clearly be much higher. Honestly, this leaves a bad taste in my mouth, as I can't imagine that gathering up every example in rich source data is a good prerequisite for beginning to classify things, nor a resource-responsible way to do it.
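To make that first phase concrete for myself, here's a minimal Python sketch (my own toy code, not Numenta's NuPIC implementation) of a level-1 node memorizing every distinct 4x4 patch it sees as a quantization point:

```python
import numpy as np

class QuantizationMemory:
    """Toy model of an HTM node's first learning phase: memorize
    every distinct input pattern ("quantization point") it is shown."""

    def __init__(self):
        self.points = []   # distinct 4x4 patterns, in order of discovery
        self._index = {}   # pattern bytes -> index into self.points

    def observe(self, patch):
        """Record a 4x4 binary patch if it is new; return its
        quantization point index either way."""
        key = patch.tobytes()
        if key not in self._index:
            self._index[key] = len(self.points)
            self.points.append(patch.copy())
        return self._index[key]

mem = QuantizationMemory()
top_line = np.zeros((4, 4), dtype=np.uint8)
top_line[0, :] = 1                 # horizontal line along the top row
i = mem.observe(top_line)
j = mem.observe(top_line)          # repeat exposure: no new point recorded
assert i == j and len(mem.points) == 1
```

With the clean rectilinear drawings the paper uses, this table tops out around 150 entries; with curves, noise, or grayscale it would balloon, which is exactly what bothers me.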

Now, one of the points of an HTM in this Pictures application is that it can learn to recognize that all "L" joints are the same thing without any prior knowledge of that. The key ingredient in the HTM recipe is this temporal coincidence. So once all 150 distinct mini-patterns, or quantization points, have been identified by watching the source images moving around in various directions against the field of view, the next step is to construct a 150 x 150 matrix initialized with all zeros. The rows and columns both represent each of the quantization points, but one represents seeing one in T1 and the other represents seeing it in T2. So let's say quantization point Q1 represents an "L" joint and Q2 represents another "L" joint shifted one pixel to the right of Q1. As the movie progresses from T1 to T2, we find the cell in the matrix where row Q1 and column Q2 meet and we add 1 to it. After a lot of this process, we end up with a matrix that has very high numbers in the few cells representing frequent coincidences of quantization points in time, like our two "L" joints, while a large portion of the matrix is still zeros. The reason for doing this is that there must be some way to say that Q1 and Q2 are related; that's the point of an HTM, and coincidence in time seems a good way.
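The counting step itself is dead simple. A toy Python version (mine, not Numenta's), taking a sequence of quantization point indices produced by a training "movie":

```python
import numpy as np

def build_time_adjacency(qp_sequence, n_points):
    """Count how often quantization point q_next follows q_prev in
    consecutive time steps of a training movie."""
    T = np.zeros((n_points, n_points), dtype=int)
    for q_prev, q_next in zip(qp_sequence, qp_sequence[1:]):
        T[q_prev, q_next] += 1
    return T

# toy movie: an "L" joint (index 0) sliding one pixel right (index 1)
# and back again, over and over
T = build_time_adjacency([0, 1, 0, 1, 0, 1], n_points=2)
assert T[0, 1] == 3 and T[1, 0] == 2   # strong temporal coincidence
```

The high counts at T[0, 1] and T[1, 0] are the evidence that the two patterns share a cause.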

An HTM has a finite number of outputs, each of which represents a "cause". The developer gets to decide the number. The more there are, in theory, the more nuanced the known causes can be. The next step of the learning process, then, is to decide what those causes are. Let's say for example there can be at most 10 "causes" that can be output. The 150 quantization points each get assigned to one of these 10 causes in a process that's a bit hard to understand. It's probably best to read section 4.2.2, "Forming temporal groups by partitioning the time-adjacency matrix", for a precise explanation. But one summary way of explaining it is that this algorithm starts at one quantization point that has the highest number of temporal connections (as represented in the 150 x 150 matrix) to others and follows along the really strong connections to other quantization points, lumping them together into one group. In theory, the connections branching out get sufficiently weak that the algorithm stops following them. Then it moves on to the next remaining quantization point that has the highest value in the matrix and continues on (ignoring all other quantization points that have already been grouped). This continues until either all quantization points with connections above a certain threshold are exhausted or we run out of groups (our maximum of 10 causes). The authors point out that this is not the only way to do grouping, but it's a pretty ingenious way to quickly allocate causes.
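As I read it, the grouping pass looks roughly like the Python below. This is my own simplification of section 4.2.2, not the paper's exact algorithm: I treat connections as undirected and use a single fixed strength threshold rather than anything more nuanced.

```python
import numpy as np

def group_quantization_points(T, max_groups, threshold):
    """Greedily partition quantization points into temporal groups:
    seed each group at the ungrouped point with the strongest total
    connectivity, then absorb neighbors whose connection strength
    clears the threshold."""
    W = T + T.T                          # treat adjacency as undirected
    ungrouped = set(range(T.shape[0]))
    groups = []
    while ungrouped and len(groups) < max_groups:
        seed = max(ungrouped, key=lambda i: W[i].sum())
        group, frontier = {seed}, [seed]
        while frontier:
            i = frontier.pop()
            for j in ungrouped - group:
                if W[i, j] >= threshold:
                    group.add(j)
                    frontier.append(j)
        groups.append(sorted(group))
        ungrouped -= group
    return groups

# two patterns that alternate in time (0 and 1) plus a loner (2)
T = np.zeros((3, 3), dtype=int)
T[0, 1], T[1, 0], T[2, 2] = 3, 2, 2
assert group_quantization_points(T, max_groups=10, threshold=2) == [[0, 1], [2]]
```

Each resulting group becomes one "cause" the node can output, which is how the two shifted "L" joints end up reported as the same thing.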

This learning algorithm is truly ingenious. I love it. And yet it bothers me, too. For one thing, this specific algorithm only cares about the coincidence of patterns from one discrete moment to the next. For another, its performance seems to rely very strongly on tight constraints on the data. As the data is allowed to become less constrained -- going from perfect right angle lines to allowing curves, allowing thicker lines, allowing dirty data, rotating in 3D, allowing grey scales or colors, and so on -- the number of quantization points and time to learn must grow exponentially. "Real" data would probably quickly deluge such a system as this with quantization points.

I'm especially bothered by the fact that each HTM requires an exhaustive learning period where it discovers all its quantization points before it moves on to start learning how they are causally related. And then this phase requires another exhaustive learning period where it discovers all the two-moment temporal relations among quantization points before it moves on to try to group the quantization points -- distinct input patterns -- into proximal causes which are then the main output of an HTM.

Further, while I recognize the value of showing a picture of a cat in many different "orientations" using these movies as a proxy for seeing lots of actual cats, I'm bothered by the idea that the movies are required for this algorithm to learn about cats. I would think that an algorithm that learns to distinguish cats as a group should be able to see lots of single, still pictures of animals of all sorts, including a cat. Heck, if I had 10 pictures of different animals and ten neurons (or HTMs), I should be able to repeatedly show each of my 10 pictures at random with different scales and orientations and have my neurons learn to align themselves to each of the 10 animals, yet the HTMs aren't going to work this way, unless I wiggle the pictures around. Why this curious requirement?

Now, in defense of HTMs, I would point out that Jeff does not see this first generation of them as the end goal, but just a first prototype that illustrates the concept. I think he would quickly agree that the learning algorithm will continue to evolve. Not only will it become more efficient and faster as generations of engineers learn to apply and enhance HTMs, but they will also become more robust. In fairness, I don't see that the quantization process necessarily has to happen before finding temporal relations occurs. They could happen in real time. Also, the prediction part need not wait until after learning. Also, the little right-angle black and white line drawings are not a necessity. Nor are temporal patterns relying on discrete two-step time periods. None of my complaints here represents a "gotcha", I think.

I have more to read, and I may take an opportunity to try coding this to reproduce this experiment and explore it more. We'll see. I have my own experiment that I started, inspired by my read of On Intelligence, which I have to start fleshing out, though. In the meantime, I'm likely to continue to comment on HTMs as I learn more. I still think they represent the most significant new concept in artificial intelligence in several decades.

EDIT: not me

@crowsnest531

MBrulla


quality posts: 12 Private Messages MBrulla

I can't say I would go to BW^3 for the wings.

I have Legend Larry's in my backyard for that...

rkenimer


quality posts: 3 Private Messages rkenimer
crowsnest wrote:snip a whole boatload of crap



Did you use one of those text generator programs to do that? I didn't even bother to read it.

Apparently it's fashionable to have a signature.

crowsnest


quality posts: 53 Private Messages crowsnest
rkenimer wrote:Did you use one of those text generator programs to do that? I didn't even bother to read it.



Those who know do not say,
Those who say do not know

@crowsnest531

natedogg828


quality posts: 4 Private Messages natedogg828
kallawm wrote:Apparently it's a seasoned kaiser roll popular on the East Coast.

I love when things are popular where I live and I've never heard of them...


Seasoned with caraway seeds and salt, very popular here in Buffalo, NY

126 Woots to date across all sites (now Woot! says I am better than everyone else) including 7 Bags of Crap...snagged a Big ol' Crybaby (#6) to secure my black square!
Proudly tracking via WootStalker.com