Comments on natural language processing blog: ((A => B) and (not B)) => (not A)
by hal (http://www.blogger.com/profile/02162908373916390369)

Anonymous (2010-05-15):
Andy: stop being logical. Relativism is the new cool.
By demonstrating reality you only prove your unKooollNesssss, dude.

Andy (2010-04-28):
Weak compression works for many but not all files. If it worked for all cases, then strong compression would be possible. Since weak compression does not work for all cases, the conditional is uninformative (denial of the antecedent).

Okay.

Now the other direction. Strong compression clearly doesn't exist. And yes, from this we can conclude by modus tollens that weak compression, as you defined it, does not exist.

However, that's not a very useful definition of weak compression.

Weak compression works for many files.

That weak compression works for many files does not imply that strong compression works for all files.

So I think I have missed the point!

Paul (2010-04-26 11:43):
Slightly more precisely, that algorithm would work on a finite set of classifiable values. If you're trying to classify the integers or reals, you're in the no-free-lunch zone: even if you get 100% precision on some test set, generalizing that to performance on an infinite set will never come out to a measurable value above chance.
So: (strong classifiers) in the bounded set with unbounded resources, (no weak classifier) in the unbounded set, and the other cases depend on how you bound the set and the resources.

Paul (2010-04-26 11:33):
I don't think you even need > 50% accuracy on the weak classifier, just better performance than a coin flip (not necessarily the same thing if the sizes of the categories are unequal). If I write a classifier that correctly identifies the 10 examples it was trained on and flips a coin for everything else, then with enough iterations we'll approach a case where every example has been used as a training sample for some classifier. With enough additional iterations, the coin flips average out to zero, and you can detect faint deviations from the mean and figure out what the classifiers trained on that sample are reporting.

Absurdly large numbers of classifiers would be required. But it would work, wouldn't it? So when you don't like (not A) or (not B), you might just be interested in a more limited case.
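[Editor's note: Paul's memorize-and-flip scheme can be checked with a small simulation. All sizes here (50 points, 10 memorized examples per classifier, 2000 classifiers) are invented for illustration; the point is only that the deterministic votes on memorized examples outweigh the coin-flip noise.]

```python
import random

def memorizing_classifier(memorized, rng):
    """A 'weak' classifier: perfect on its memorized examples, coin flip elsewhere."""
    def classify(x):
        if x in memorized:
            return memorized[x]
        return rng.choice([-1, 1])
    return classify

def majority_vote(classifiers, x):
    """Sum the votes: memorized (always correct) votes accumulate, coin flips cancel."""
    return 1 if sum(c(x) for c in classifiers) >= 0 else -1

rng = random.Random(0)
data = {x: rng.choice([-1, 1]) for x in range(50)}   # 50 points, arbitrary +/-1 labels

# 2000 weak classifiers, each memorizing a random sample of 10 examples.
ensemble = [
    memorizing_classifier({x: data[x] for x in rng.sample(list(data), 10)}, rng)
    for _ in range(2000)
]

accuracy = sum(majority_vote(ensemble, x) == y for x, y in data.items()) / len(data)
print(accuracy)
```

Each point is memorized by roughly 400 of the 2000 classifiers, so its vote total carries a fixed correct margin of about 400 against coin-flip noise with standard deviation around 40; the majority vote is essentially always right, matching Paul's "absurdly large numbers of classifiers" intuition.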
Grae BG (2010-04-25):
I like your thinking. But I wonder...

The existence of weak classifiers implies the existence of strong classifiers only if we have assumed that repeated use of weak classification leads to strong classification. To be more mathematically precise, you have assumed that you can, through repetition of weak classification, get arbitrarily close to strong classification. Under this assumption you are correct, but I don't think that is a safe assumption to make.

A proof by contradiction in mathematics shows that an assumption cannot possibly be true; instead of concluding that weak classifiers don't exist (since they clearly do), we must simply question the assumption your implication is grounded upon.

Bob Carpenter (2010-04-22, http://lingpipe-blog.com/):
Your boosting and compression examples remind me of the (linear) speedup theorem (http://en.wikipedia.org/wiki/Linear_speedup_theorem). You can always improve the constant term of an algorithm's performance. (It doesn't work in practice, of course, because unfolding eventually hurts more than it helps due to program size and thus non-locality of instructions.)

hal (2010-04-20 21:04):
Well, I have to say that explaining the details of boosting wasn't really the point of this post (which is why I tried to be as hand-wavy as possible about it). But yes, it should be "almost always" (not "always always").
And of course it's not weak classifiers that I'm looking for, but a learning algorithm that's guaranteed to produce weak classifiers for (almost) any data set. The notion of "almost" is that it only has to do it (say) 90% of the time, and the other 10% of the time it can segfault for all I care. (Though note that you can boost this term, too.)

@suresh: I almost said something about complexity theory, which seems to kind of *only* have results of the form "if something almost certainly not true is true, then something else almost certainly not true is also true..." I think these are called "pigs can fly" theorems or something...

Suresh Venkatasubramanian (2010-04-20 19:01):
This sounds like the definition of an NP-hardness reduction :). I don't really believe I can solve SAT, so I MUST believe that this other problem is really hard.

bj (2010-04-20 17:01):
You need to modify this to say that the many, many different weak classifiers that would combine to produce a strong classifier do not all collectively and simultaneously exist. That doesn't mean you can't find one (or a small number) of weak classifiers. I could create an unlimited number of weak classifiers if they were all identical, for example.

Igor (2010-04-20 15:49):
I have a hard time following the definitions.
Does "most of the time" mean "on average"? Otherwise, I don't understand what it means to have accuracy of "50.1% (most of the time)".

Also, a weak classifier is defined as "one that (always always) does a not completely crappy job at classification". Is that supposed to be "almost always" instead of "always always"?

Thanks.

Anonymous (2010-04-20 12:09):
My apologies in advance for my comments, because I'm pretty sure I didn't get your point :) But in the boosting case, the statement A => B itself is false. Isn't there a limit to getting a strong classifier from weak classifiers?
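[Editor's note: the thread keeps circling the A => B direction, weak learners giving a strong learner. As a non-authoritative illustration of the textbook mechanism (Freund and Schapire's AdaBoost, not anything from the original post), here is a toy run with decision stumps on a made-up 1D dataset that no single stump can separate; all names and numbers are invented for the sketch.]

```python
import math

# Toy 1D dataset, labels +++ ---- +++ : no single threshold stump separates it.
X = list(range(10))
Y = [1, 1, 1, -1, -1, -1, -1, 1, 1, 1]

def stump(theta, s):
    """Decision stump: predicts s for x > theta, -s otherwise."""
    return lambda x: s if x > theta else -s

# Pool of weak hypotheses: every threshold between points, both polarities.
stumps = [stump(t - 0.5, s) for t in range(11) for s in (1, -1)]

def adaboost(rounds):
    w = [1.0 / len(X)] * len(X)   # uniform example weights
    ensemble = []                 # (alpha, stump) pairs
    for _ in range(rounds):
        # Pick the stump with the smallest weighted error on the current weights.
        h = min(stumps, key=lambda h: sum(wi for wi, x, y in zip(w, X, Y) if h(x) != y))
        eps = sum(wi for wi, x, y in zip(w, X, Y) if h(x) != y)
        eps = min(max(eps, 1e-10), 1 - 1e-10)   # guard against log(0)
        alpha = 0.5 * math.log((1 - eps) / eps)
        ensemble.append((alpha, h))
        # Re-weight: misclassified examples gain weight, then normalize.
        w = [wi * math.exp(-alpha * y * h(x)) for wi, x, y in zip(w, X, Y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def strong_classify(ensemble, x):
    return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

ens = adaboost(10)
weak_best = max(sum(h(x) == y for x, y in zip(X, Y)) for h in stumps) / len(X)
strong_acc = sum(strong_classify(ens, x) == y for x, y in zip(X, Y)) / len(X)
print(weak_best, strong_acc)  # best single stump gets 0.7; the boosted vote gets 1.0
```

This illustrates hal's A => B on training data only: a weak-learner guarantee (every round's stump beats 1/2 on the reweighted sample) is what drives the boosted combination to a strong fit, which is exactly the assumption Grae BG and the last Anonymous commenter are questioning for the general case.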