Complex Adaptive Systems<br />A blog by Laughing Man on Artificial Intelligence, Cognitive Science, Computational Science, German Language, Japan, Japanese Language, Language, Language Evolution, Linguistics, Literature, Neuroscience, Philosophy, Poetry and Politics.<br /><br /><span style="font-weight: bold;">New Blog</span> (2010-03-15)<br /><br />I'm quite lazy... and to keep this blog clean, I decided to create a new one dedicated solely to security and programming. Please check it out: <a href="http://dakhma.net">dakhma.net</a><br /><br />Oh, by the way, a <a href="http://en.wikipedia.org/wiki/Dakhma">dakhma</a> is a structure used in the Zoroastrian religion for the purification of dead bodies, i.e. the bodies are cleaned by birds.<br /><br />One last note: I hope to post more language- and linguistics-related news soon, though I'm currently working on quite a lot of projects.<br /><br /><span style="font-weight: bold;">Zipf's law</span> (2009-12-13)<br /><br />Last Friday I gave a presentation on <a href="http://en.wikipedia.org/wiki/Zipf%27s_law">Zipf's law</a>, primarily concerned with processing frequency lists and spectra in <a href="http://www.r-project.org/">R</a> and <a href="http://zipfr.r-forge.r-project.org/">zipfR</a>. The scripts contain two Python programs for extracting frequencies from <a href="http://www.nltk.org/">NLTK</a>'s internal Gutenberg selection corpus and from section J of the <a href="http://www.aclweb.org/anthology/">ACL Anthology</a> corpus. 
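For readers without R at hand, the shape of a Zipfian frequency list is easy to eyeball in plain Python. A toy sketch (my own illustration here, not one of the scripts from the talk):

```python
from collections import Counter

def rank_frequency(tokens):
    """Return (rank, frequency) pairs, most frequent type first."""
    counts = Counter(tokens)
    return list(enumerate(sorted(counts.values(), reverse=True), start=1))

tokens = ("now the lord once decided to set off for the mountain "
          "where the man lives").split()
for rank, freq in rank_frequency(tokens):
    # Under Zipf's law, freq falls off roughly in proportion to 1/rank.
    print(rank, freq)
```

zipfR's TFL (type frequency list) and SPC (frequency spectrum) files encode essentially this kind of information.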
If you don't have access to the ACL, I provide the processed TFL and SPC files for both corpora in the ZIP file.<br /><br /><div style="text-align: center;"><span style="font-weight: bold;">Download</span>:<br />[<a href="http://rapidshare.com/files/320081926/zipf.pdf">Slides</a>]<br />[<a href="http://rapidshare.com/files/320080402/skripte.zip">Scripts</a>]<br /></div><br /><br /><span style="font-weight: bold;">Some NLP-related Python code</span> (2009-10-11)<br /><br />1. A program that computes the <a href="http://en.wikipedia.org/wiki/Flesch%E2%80%93Kincaid_readability_test">Flesch score</a> of a text. Code <a href="http://rapidshare.com/files/291542319/flesch.py">here</a>. I'm not sure the syllables are counted correctly.<br /><br />2. A program that searches a wordlist for <a href="http://en.wikipedia.org/wiki/Minimal_pair">minimal pairs</a>. Code <a href="http://rapidshare.com/files/291543623/mp.py">here</a>. Example <a href="http://rapidshare.com/files/291549010/example.txt">here</a>. The format of the wordlist is restrictive, and each minimal pair is printed twice!<br /><br />3. A program that obfuscates its input: the first and last letters of each word stay in place, but everything in between is shuffled. Code <a href="http://rapidshare.com/files/291546489/obfuscate.py">here</a>.<br /><br />4. A program that constructs a tree from a file and searches for the lowest common ancestor of two nodes. Code <a href="http://rapidshare.com/files/291551725/tree.py">here</a>. 
Example <a href="http://rapidshare.com/files/291551769/tree.txt">here</a>.<br /><br /><span style="font-weight: bold;">Python-based RPN Evaluator</span> (2009-09-28)<br /><br />This program evaluates logical expressions written in <a href="http://en.wikipedia.org/wiki/Reverse_Polish_notation">Reverse Polish Notation</a> (RPN) against a world defined in a text file.<br /><br /><span style="font-weight: bold;">Example world file:</span><br />wind<br />/sun<br />/rain<br />red<br /><br />wind and red have the value 1; sun and rain have the value 0, since they are prefixed by "/".<br />Here's the syntax to run the program: "python log.py myworld.world".<br />It quits when an empty expression is entered.<br /><br /><span style="font-weight: bold;">Example usage:</span><br /><span style="font-style: italic;">C:\Python26>python log.py myworld.world</span><br /><span style="font-style: italic;">Logical Expression: rain sun &</span><br /><span style="font-style: italic;">0</span><br /><span style="font-style: italic;">Logical Expression: sun red |</span><br /><span style="font-style: italic;">1</span><br /><span style="font-style: italic;">Logical Expression: sun wind ^</span><br /><span style="font-style: italic;">True</span><br /><span style="font-style: italic;">Logical Expression: winter sun &</span><br /><span style="font-style: italic;">*** Error while evaluating: Bad name: 'winter'.</span><br /><span style="font-style: italic;">Logical Expression: sun red</span><br /><span style="font-style: italic;">*** Error while evaluating: Unbalanced expression: 'sun red'.</span><br /><span style="font-style: italic;">Logical Expression: sun red red |</span><br /><span style="font-style: italic;">*** Error while evaluating: Unbalanced expression: 'sun red red |'.</span><br /><span style="font-style: italic;">Logical Expression:</span><br /><br /><span style="font-style: 
italic;">C:\Python26></span><br /><br /><br /><br />Find the source code <a href="http://rapidshare.com/files/286055746/log.py">here</a>.<br /><br /><span style="font-weight: bold;">Download all SMBC Comics</span> (2009-09-25)<br /><br />A simple regex-based brute-force program to save all comics from <a href="http://www.smbc-comics.com/">http://www.smbc-comics.com/</a>. You'll need <a href="http://commons.apache.org/io/">http://commons.apache.org/io/</a> and <a href="http://rapidshare.com/files/284823630/SMBC.java">my source code</a>.<br /><br /><span style="font-weight: bold;">Google Cheat Sheet 0.11</span> (2009-09-15)<br /><br />I wrote a Google cheat sheet: http://rapidshare.com/files/280485137/gcs.pdf<br /><br />It's simple and contains every working function in Google Search, Groups, News and the Calculator. What's missing? Query suggestions...<br /><br /><span style="font-weight: bold;">Poor Networks, Neurons and Lookaheads</span> (2009-08-15)<br /><br />Syntactic networks bear similarities to biological networks: they are scale-free, i.e. the distribution of nodes and edges follows a power law, and small-world, i.e. most nodes can be reached from any other in a relatively small number of steps (e.g. 
social networks):<br /><br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://img23.imageshack.us/img23/4452/400pxscalefreenetworksa.png"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 400px; height: 189px;" src="http://img23.imageshack.us/img23/4452/400pxscalefreenetworksa.png" alt="" border="0" /></a><br /><div style="text-align: center;"><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="http://img441.imageshack.us/img441/963/250pxcomplexnetworkn25w.png"><img style="margin: 0px auto 10px; display: block; text-align: center; cursor: pointer; width: 250px; height: 250px;" src="http://img441.imageshack.us/img441/963/250pxcomplexnetworkn25w.png" alt="" border="0" /></a> <span style="font-size:85%;">From Wikipedia [<a href="http://en.wikipedia.org/wiki/Scale-free_network">EN</a>] [<a href="http://es.wikipedia.org/wiki/Red_de_mundo_peque%C3%B1o">ES</a>]</span><br /></div><br /><br />A group of researchers at the Institute of Applied Linguistics in Beijing, China, tried to find similarities between semantic and syntactic networks via a statistical approach, using a <a href="http://en.wikipedia.org/wiki/Treebank">treebank</a> annotated with <a href="http://en.wikipedia.org/wiki/Thematic_relation">semantic roles</a>. Both networks are small-world and scale-free, but they differ in hierarchical structure and <a href="http://en.wikipedia.org/wiki/K-nearest_neighbour">k-nearest-neighbour</a> correlation, and semantic networks tend to form longer paths, which makes their hierarchy poorer than that of syntactic networks: <a href="http://www.scichina.com:8080/kxtbe/EN/abstract/abstract413921.shtml">Statistical properties of Chinese semantic networks</a><br /><br /><br />Temporal fluctuations in speech are easily corrected by our brain. For decades this mechanism was a mystery. 
Two researchers at the Hebrew University of Jerusalem, Israel, described how neurons adjust to decode distorted sound perfectly. Although I don't understand this very technical paper, it'll perhaps provide new algorithms for speech processing: <a href="http://www.plosbiology.org/article/info%3Adoi%2F10.1371%2Fjournal.pbio.1000141">Time-Warp-Invariant Neuronal Processing</a><br /><br />Another improvement for speech recognition and production was achieved by the Max Planck Society, which developed a new mathematical model. It's based on the look-ahead assumption, i.e. our brain tries to estimate the most probable sound sequence based on previous information; e.g. given 'hot su...', 'sun' is ranked above 'supper': <a href="http://www.ploscompbiol.org/article/info:doi%2F10.1371%2Fjournal.pcbi.1000464;jsessionid=4B40AF295D045683287D095DE9821381">Recognizing Sequences of Sequences</a><br /><br /><span style="font-weight: bold;">These little annoying and surprising things...</span> (2009-07-28)<br /><br />...concerning Python as a language of "very clear syntax which emphasizes code readability" (Wikipedia):<br /><span style="font-weight: bold;font-size:180%;" ><br /></span><span style="font-size:130%;">A = Annoying == Anti-Python</span><br /><br /><span style="font-weight: bold;">1. The naming conventions of...:</span><br /><br />a) ...package files/magic members: <span style="font-style: italic;">__init__.py</span>, <span style="font-style: italic;">def __del__</span>, <span style="font-style: italic;">__name__</span><br />b) ...visibility modifiers: <span style="font-style: italic;">_protected</span> and <span style="font-style: italic;">__private</span><br /><br />So _ and __ in general. Really, why did Guido do this? Is there an explanation? Perhaps it's inherited from another language?<br /><br /><span style="font-weight: bold;">2. 
The verbose...:</span><br /><br />a) ...object inheritance declaration of each class: <span style="font-style: italic;">class Standard(object)</span><br />b) ..."self" reference in every method and constructor, just to indicate that it's non-static: <span style="font-style: italic;">def compute(self, number),</span> <span style="font-style: italic;">self.Radius</span>, <span style="font-style: italic;">def __init__(self)</span><br /><span style="font-size:130%;"><br /><br /><span style="font-size:85%;"><span>(3. Multiple inheritance</span>: It's no coincidence that most languages don't support multiple inheritance. Normally you don't need it, and it's a trap that makes debugging almost impossible. It is definitely not a feature for a language which emphasizes code readability and clear syntax.)</span><br /><br /><br /><br /><br /><span>S = Surprising</span></span><br /><br />(Powerful ability to handle and process strings in general.)<br /><br />1. Lambda/anonymous functions: <span style="font-style: italic;">(lambda x, y : x + y)</span><br />2. Managed attributes: <span style="font-style: italic;">property([fget[, fset[, fdel[, doc]]]])</span><br />3. Great modification abilities thanks to magic members/methods and type emulation.<br />4. List comprehensions, generator expressions and yield.<br />5. Function decorators. Java has something similar with annotations.<br />6. The localization module. It's neat and easy to localize your programs.<br />7. The parallel computing module!<br />8. Awesome network protocol capabilities.<br />9. Unit tests.<br />10. The best documentation I've ever seen.<br /><br /><span style="font-weight: bold;">Blagh</span> (2009-07-26)<br /><br />1. 
Blagh:<br /><ul><li>I read the post "<a href="http://coadsy.blogspot.com/2009/02/vacation-last-semester-was-very-intense.html">Crying</a>" again and must say that Java is definitely my favourite programming language.</li><li>I've totally lost my knowledge of Python, which is a bad thing since I'll have to use it in an exam in a few months.</li><li>I got attached to Twitter. It's very convenient, and you can watch it in the panel of my blog.</li><li>I read Russell's Introduction to Mathematical Philosophy. Very interesting.<br /></li></ul><br />2. Pathfinder:<br /><br />I had to write a small Java program that implements Dijkstra's shortest-path algorithm in order to find the shortest path between two cities. You can download the source code <a href="http://rapidshare.com/files/260261125/Pathfinder.zip">here</a>. You can add nodes and edges to data.txt, and the user interface is console-based.<br /><br /><br />3. Imageboardsave:<br /><br />Currently I'm writing a program which downloads every image and every Rapidshare URL in a thread of an imageboard. The program works for AnonIB at the moment, and the user can specify how many pages to search. The program scans the page for threads and downloads the content to:<br /><br />/threadID/picturefilename.something<br /><br /><br />and the Rapidshare URLs to:<br /><br />/threadID/rapidshare.txt<br /><br /><br />Subthreads, i.e. threads longer than one page, are handled as well. It even has a graphical user interface programmed in Java Swing. Actually, it runs pretty well right now, but I don't want to release it yet.<br /><br /><span style="font-weight: bold;">Language Guesser and OpenNLP Pipe</span> (2009-05-20)<br /><br />1. The first program I wrote estimates the language of a document, based on a simple statistical bigram model. 
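The idea behind such a guesser can be sketched with character bigrams (a hypothetical miniature of my own, not the actual program): estimate relative bigram frequencies per language from training text, then score the test text under each model and pick the winner.

```python
import math
from collections import Counter

def bigrams(text):
    return [text[i:i + 2] for i in range(len(text) - 1)]

def train(text):
    # Relative frequency of each character bigram in the training text.
    counts = Counter(bigrams(text))
    total = sum(counts.values())
    return {bg: n / total for bg, n in counts.items()}

def score(model, text, floor=1e-6):
    # Sum of log probabilities; unseen bigrams get a small floor value.
    return sum(math.log(model.get(bg, floor)) for bg in bigrams(text))

def guess(models, text):
    return max(models, key=lambda lang: score(models[lang], text))

models = {"de": train("der die das und ist nicht ein eine ich sie"),
          "en": train("the of and to in is that it was he for on")}
print(guess(models, "und das ist die"))  # prints "de"
```

A real guesser would of course train on full Europarl protocols rather than tiny stopword lists, and use proper smoothing instead of a fixed floor for unseen bigrams.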
It takes five command-line arguments, using the following syntax:<br /><br />java Main trainfile_lang_1 trainfile_lang_2 trainfile_lang_3 trainfile_lang_4 testfile_lang<br /><br />e.g.<br /><br />java Main ./europarl/train/de/ep-01-de.train ./europarl/train/en/ep-01-en.train ./europarl/train/es/ep-01-es.train ./europarl/train/fr/ep-01-fr.train ./europarl/test/de/ep-01-de.test<br /><br />and writes to standard output: "Die Sprache ist wahrscheinlich Deutsch." (The language is probably German.)<br /><br />You see, this is very static, and perhaps I'll make it more dynamic in the future. It was originally created to estimate the language of a protocol from the European Parliament. You can download it <a href="http://rapidshare.com/files/235147483/Bigram.tar">here</a>, but please be aware that the comments are in German.<br /><br /><br />2. The second program is more interesting. It takes an XML file, tags and all, and writes the sentences to a file in the following format:<br /><br />token TAB tag TAB chunk<br /><br />e.g. for the sentence "Is this love?"<br /><br />Is VBZ O<br />this DT B-NP<br />love NN O<br />? . O<br /><br />It just takes the path to the XML file as a command-line argument. To run the program you'll need two things: <a href="http://ant.apache.org/">Ant</a> and the OpenNLP <a href="http://opennlp.sourceforge.net/models/">models</a>. You can download the program <a href="http://rapidshare.com/files/236971240/NLPipe.zip">here</a>.<br /><br />/*<br />* To ensure the functionality of the program, the models EnglishChunk.bin.gz,<br />* EnglishSD.bin.gz, EnglishTok.bin.gz, tag.bin.gz and tagdict.htm have to be<br />* in the models directory.<br />*<br />* IMPORTANT:<br />* The models have to be downloaded and placed in the models directory,<br />* otherwise the program won't work. The download links can be found in<br />* ./model/readme.txt.<br />*<br />* Install:<br />*<br />* 1. 
Download the models at OpenNLP: http://opennlp.sourceforge.net/<br />* 2. Run Ant<br />* 3. Start the program with: java -jar NLPipe.jar XMLfile<br />* 4. Optionally you can run the program with: ant -Dargs="XMLfile" run<br />* 5. Optionally you can archive the basedir with: ant tar<br />*/<br /><br /><span style="font-weight: bold;">Recap</span> (2009-04-10)<br /><br />This is going to be a very boring semester. <span style="font-weight: bold;">Java</span> is really interesting but also complicated for a beginner. Compared to Python, it feels like a mature language, and you have to think (more) about what you're doing. <span style="font-weight: bold;">Formal Syntax</span> is complex and time-consuming, especially for me, since I'm not into syntax. I'm more interested in semantics, and therefore <span style="font-weight: bold;">Logic</span> is quite fancy - although sometimes it's as boring as math. <span style="font-weight: bold;">Acoustic phonetics</span> hasn't started yet, and <span style="font-weight: bold;">Artificial Intelligence</span> is fun but often too superficial - we aren't concerned with Natural Language Processing. Hence I plan to do a small pragmatic series on AI and to recap what I learn in the course. I also took two courses in English: <span style="font-weight: bold;">Translation into German</span> (boring as hell) and <span style="font-weight: bold;">English up to the 1700s</span> (boring lecturer), but they aren't worth mentioning.<br /><br />Finally, there's no post in my blogroll which could spark my interest. Except - well, in a not so positive way - a news item on EurekAlert. It's one of those cases I'd file under 'the most unspectacular findings in science - findings that are no findings, because everybody already knows them'. 
The study says that music is culturally independent when it comes to conveying (certain) emotions. Well, everyone who's listened to a song in another language knows this. The study goes one step further and investigates the influence of music from fundamentally different cultures: <a href="http://esciencenews.com/articles/2009/03/19/language.music.really.universal.study.finds">Language of music really is universal, study finds</a><br /><br /><span style="font-weight: bold;">Points of Interest, 16 March 2009</span><br /><br />1. Joint attention and the difference between <span style="font-style: italic;">we</span> and <span style="font-style: italic;">I</span> might be the key to understanding language evolution.<br />My opinion is that language evolved to set borders. The border of <span style="font-style: italic;">I</span>: my person, my belongings, my needs and thoughts, my standing within society - in opposition to others.<br />The border of <span style="font-style: italic;">we</span>: our tribe, our race, our territory, our hunting grounds, our ideals - in opposition to others.<br />Language is a <span>cultural and psychological necessity</span> to govern, identify, interact and define oneself as things get more complex. A <span>biological necessity</span> to define entities in time and space and to interact with them in complex ways on different planes. A way to <span>optimize</span> the way things are done: hunting strategies, complex hierarchies, relations to other tribes: <a href="http://www.babelsdawn.com/babels_dawn/2009/03/a-tale-without-episodes.html#more">A Tale Without Episodes</a><br /><br />2. A very strange article. As a computational linguist I'm very interested in how they compute the evolution of a language. I can't imagine how this could work. 
There are several factors which seem to be just too random to be calculated. How did they trace the words? Did they use OCR to scan documents and match words? There is this particular paragraph, which I don't understand at all:<br /><br />"<span style="font-style: italic;">Looking to the future, the less frequently certain words are used, the more likely they are to be replaced. Other simple rules have been uncovered - numerals evolve the slowest, then nouns, then verbs, then adjectives. Conjunctions and prepositions such as 'and', 'or', 'but', 'on', 'over' and 'against' evolve the fastest, some as much as 100 times faster than numerals.</span>"<br /><br />What does evolve mean here? Change? How can prepositions change? These things are bound by perception and are, in my opinion, as static as numerals: <a href="http://www.reading.ac.uk/about/newsandevents/releases/PR19825.asp">Scientists discover oldest words in the English language and predict which ones are likely to disappear in the future</a><br /><br />3. I'm quite into morbid things, like mummies and anatomical stuff. Lately, Pink Tentacle wrote a post about monster mummies and living mummies - formerly known as living Buddhist monks. They killed themselves (slowly) over a span of 3,000 days and are now relics of a sort: <a href="http://www.pinktentacle.com/2009/03/monster-mummies-of-japan/">Monster mummies of Japan</a><br /><br />4. Does your language influence your preferences in music? Language and music are associated: <a href="http://scienceblogs.com/notrocketscience/2009/03/why_music_sounds_right_-_the_hidden_tones_in_our_own_speech.php#more">Why music sounds right - the hidden tones in our own speech</a><br /><br />5. A very basic lecture about language held at Yale University. 
It covers fundamentals of linguistics - pidgins and creoles, language as a human trait, language universals, (innate) language capacity, phonetics, morphology, syntax, semantics (in this order) and so on: <a href="http://academicearth.org/lectures/how-do-we-communicate-language">How Do We Communicate?: Language in the Brain, Mouth and the Hands</a><br /><br /><span style="font-weight: bold;">Crying</span> (2009-02-18)<br /><br />Vacation! Last semester was very intense, no time for anything. Gosh, and now I'm trying to learn Java... It's far more complicated than I imagined. Python was neat and easy - in comparison - but Java is far more potent, so they say. I don't quite understand why I have to write code which seems superfluous - at least to a beginner like me, e.g.<br /><br /><pre><code>public class HelloWorld<br />{<br />    public static void main(String argv[])<br />    {<br />        System.out.println("Hello World!");<br />    }<br />}</code></pre><br />is equivalent to<br /><br /><pre>print "Hello World!"</pre><br />in Python.<br /><br />And next semester I'm going to die for sure... Here are my courses:<br /><br /><span style="font-weight: bold;">Formal Syntax</span>: I'm more the morphology, phonology, semantics type. Actually, syntax is the only thing which gives me problems.<br /><span style="font-weight: bold;">Java</span>: Well, it's going to be hard, I'd say.<br /><span style="font-weight: bold;">Logic</span>: I like formal logic, really.<br /><span style="font-weight: bold;">Artificial Intelligence</span>: Very, very interesting. 
Inference, neural networks, genetic algorithms and that stuff - you know...<br /><span style="font-weight: bold;">Acoustic Phonetics</span>: This sounds good: reading spectrograms and getting in touch with VoiceXML.<br /><br /><span style="font-weight: bold;">Points of Interest, 17 February 2009</span><br /><br />1. 20 years in prison or a death sentence for translating the Qur'an? Welcome to our illustrious humanistic and democratic circle, dear Afghanistan: <a href="http://languagelog.ldc.upenn.edu/nll/?p=1151">The dangers of translation</a><br /><br />2. Help classify galaxies and work for science, now! You'll be shown pictures of galaxies and asked questions about their spiral arms and other features - a task which you can do better than a computer and which will help astronomy: <a href="https://www.galaxyzoo.org/">Galaxy Zoo 2</a><br /><br />3. The 2009 <a href="http://en.wikipedia.org/wiki/J%C5%8Dy%C5%8D_kanji">Jōyō Kanji</a> update is coming up, and the two favourites are 俺 (ore, informal "I") and 誰 (dare, the question pronoun "who?"). I think "dare" is quite common these days, even in formal contexts. Of course one can say donata, which is more formal: <a href="http://no-sword.jp/blog/2009/02/joyo_list_to_level_up.html">Jōyō list to level up</a><br /><br />4. Speech perception is much more than hearing sounds. Several senses besides hearing are involved: <a href="http://www.eurekalert.org/pub_releases/2009-02/afps-rml021109.php">Read my lips: Using multiple senses in speech perception</a><br /><br />5. 
Steven Pinker explains the critical points of his book "The Blank Slate":<br /><br /> <iframe src="http://dotsub.com/media/7ab08bc5-c2f7-40c8-8f01-29b75aa37287/e/m" width="420" frameborder="0" height="347"></iframe><br /><br /><span style="font-weight: bold;">Python Madness</span> (2008-12-22)<br /><br />My night at 3 AM: hacking Python code to learn the language. That's the life! Here's what I've done so far:<br /><br />1. Decimal to binary conversion:<br /><br /><pre>import sys<br /><br />def bd(x):<br />    # Collect the remainders of repeated division by 2,<br />    # most significant bit first.<br />    n = []<br />    if x < 0:<br />        return "Positive integer required"<br />    elif x == 0:<br />        return [0]<br />    else:<br />        while x > 0:<br />            n.insert(0, x % 2)<br />            x = x / 2<br />        return n<br /><br />if __name__ == "__main__":<br />    try:<br />        number = int(raw_input("Number: "))<br />        print bd(number)<br />    except ValueError:<br />        sys.stderr.write("Integer required\n")</pre><br /><br />2. 
Basic truth tables:<br /><br /><pre>def logicalAnd():<br />    for valueOne in range(2):<br />        for valueTwo in range(2):<br />            print "%d %d %d" % (valueOne, valueTwo, valueOne and valueTwo)<br /><br />def logicalOr():<br />    for valueOne in range(2):<br />        for valueTwo in range(2):<br />            print "%d %d %d" % (valueOne, valueTwo, valueOne or valueTwo)<br /><br />def logicalConditional():<br />    for valueOne in range(2):<br />        for valueTwo in range(2):<br />            print "%d %d %d" % (valueOne, valueTwo, not valueOne or valueTwo)<br /><br />def logicalBiconditional():<br />    for valueOne in range(2):<br />        for valueTwo in range(2):<br />            print "%d %d %d" % (valueOne, valueTwo, valueOne is valueTwo)<br /><br />if __name__ == "__main__":<br />    op = raw_input("Connective: ")<br />    if op == "and":<br />        logicalAnd()<br />    elif op == "or":<br />        logicalOr()<br />    elif op == "conditional":<br />        logicalConditional()<br />    elif op == "biconditional":<br />        logicalBiconditional()<br />    else:<br />        print "Connective not known"</pre><br /><br />3. ASCII table. The first column is the ASCII value, the second the local interpretation, the third the raw byte representation, the fourth the hexadecimal value:<br /><br /><pre>for element in xrange(256):<br />    print "%s \t %s \t %s \t %s" % (element, chr(element),<br />        str(tuple(chr(element))).strip("()'',"), chr(element).encode("hex"))</pre><br /><br />4. 
A perhaps overcomplicated binary to decimal program:<br /><br /><pre>def reverseRange(input):<br />    # Exponents for each digit position, highest first.<br />    n = []<br />    for i in range(len(input)-1, -1, -1):<br />        n.append(i)<br />    return n<br /><br />def singleValues(input):<br />    # Split the input string into single digits.<br />    m = []<br />    for i in input:<br />        m.append(i)<br />    return m<br /><br />if __name__ == "__main__":<br />    input = raw_input("Number: ")<br />    rR = reverseRange(input)<br />    sV = singleValues(input)<br />    dN = 0<br />    for i in range(len(sV)):<br />        dN += int(sV[i]) * 2**int(rR[i])<br />    print dN</pre><br /><br />This code works in Python 2.6.1.<br /><br /><span style="font-weight: bold;">Ubuntu On Samsung NC10</span> (2008-12-21)<br /><br />Last month I bought Samsung's NC10 netbook, and I'm astonished at how cool it is. It's really handy if you travel a lot (I do!) and have a lot to code (I do!). Unfortunately, there are some issues which have to be solved first.<br /><br />1. Touchpad problem: It totally sucks when you're writing something and your fat, overdimensioned nerdy hand (or what I like to call it: The Hand Of Code) brushes even slightly over the touchpad. Therefore you really need to disable the touchpad while typing. Fortunately, the gods of Ubuntu created a program called 'syndaemon' which does exactly that. It's useful to add it to your autostart via System > Preferences > Sessions.<br /><br />2. Excessive load cycling: It slowly kills your hard drive, so better follow <a href="https://wiki.ubuntu.com/PowerManagement">these instructions</a> to set the correct values. 
It seems that there are still issues even after you've changed the options.<br /><br /><br />For further information you should check out:<br /><br /><a href="http://nc10ubuntu.wordpress.com/">Ubuntu on the Samsung NC10</a><br /><a href="http://nc10linux.wordpress.com/">Linux on the Samsung NC10</a><br /><a href="https://help.ubuntu.com/community/NC10">The Ubuntu NC10 Community Documentation</a><br /><br /><span style="font-weight: bold;">Natural Language Processing Online Applications</span> (2008-12-19)<br /><br />I want to present an interesting link list of free, interactive, NLP-related online applications:<br /><br />1. The <a href="http://decentius.aksis.uib.no/logon/xle.xml">XLE Web Interface</a> allows you to parse sentences in German, English, Norwegian, Welsh, Malagasy and Arabic. You'll get a very detailed parse tree and the functional structure of the sentence; for "This is madness!" you'd get:<br /><br /><div style="text-align: center;"><a href="http://img247.imageshack.us/my.php?image=xlewiku5.png" target="_blank"><img src="http://img247.imageshack.us/img247/6359/xlewiku5.th.png" alt="Free Image Hosting at www.ImageShack.us" border="0" /></a></div><br /><br />2. <a href="http://wortschatz.uni-leipzig.de/">Wortschatz Leipzig</a> is a German application that crawls the web for a word and returns a detailed analysis of its frequency, collocations and semantic relations. The word graphs are most interesting, e.g. the graph for "Humbug" (German for "rubbish"):<br /><br /><div style="text-align: center;"><a href="http://img514.imageshack.us/my.php?image=wlbq8.png" target="_blank"><img src="http://img514.imageshack.us/img514/4927/wlbq8.th.png" alt="Free Image Hosting at www.ImageShack.us" border="0" /></a><br /></div><br /><br />3. 
<a href="http://wordnet.princeton.edu/">WordNet</a> is a large lexical database of English; e.g. "house" shows the following senses:<br /><br /><div style="text-align: center;"><a href="http://img155.imageshack.us/my.php?image=wnnp1.png" target="_blank"><img src="http://img155.imageshack.us/img155/2257/wnnp1.th.png" alt="Free Image Hosting at www.ImageShack.us" border="0" /></a></div><br /><br />4. <a href="http://www.answerbus.com/index.shtml">Answerbus</a> is a search engine like Google or Yahoo, but with semantics! You can ask natural questions like "Who killed JFK?" and will (perhaps) get the answer "Oswald killed JFK". Perhaps... because the system actually sucks and you can easily outmaneuver it. Another such search engine is <a href="http://start.csail.mit.edu/">START</a>, which sucks too.<br /><br />5. <a href="http://beta.visl.sdu.dk/visl/de/edutainment/games/wordfall.php">Wordfall</a> is an awesome linguistic game! It's like Tetris, but instead of stacking blocks you have to match words to their constituents. Look:<br /><br /><div style="text-align: center;"><a href="http://img228.imageshack.us/my.php?image=unbenanntnn1.png" target="_blank"><img src="http://img228.imageshack.us/img228/3080/unbenanntnn1.th.png" alt="Free Image Hosting at www.ImageShack.us" border="0" /></a><br /></div><br /><br />6. <a href="http://www.sfs.uni-tuebingen.de/~lothar/nw/">Wortwarte</a> is a German site that collects and sorts neologisms from the media.<br /><br />7. A cool German chatbot called <a href="http://www.elbot.de/">ELBOT</a>. It would definitely pass my Turing test.<br /><br />8. Think of a thing and <a href="http://www.20q.net/">20Q</a> will read your mind by asking 20 questions. <br /><br />9. Machine translation is one of the prime disciplines of NLP. Everyone knows <a href="http://de.babelfish.yahoo.com/">Babelfish</a>. 
It's not only a translator in the Hitchhiker's Guide but also an online translator like <a href="http://translate.google.de/">Google Translation</a>. <br /><br />10. <a href="http://odur.let.rug.nl/~vannoord/TextCat/Demo/">TextCat</a> is a language guesser based on an n-gram Perl script. Another and better one would be the <a href="http://www.xrce.xerox.com/competencies/content-analysis/tools/guesser-ISO-8859-1.en.html">XRCE language guesser</a>.Laughing Manhttp://www.blogger.com/profile/17475724926007017604noreply@blogger.com0tag:blogger.com,1999:blog-281239845003259417.post-5520835823751112962008-12-03T13:37:00.010+01:002008-12-06T11:06:39.233+01:00N-Gram M-AdnessOne of the basic concepts of Natural Language Processing is the n-gram model: the splitting of a sequence into overlapping subsequences of length n, e.g.<br /><br />(1) Now the lord once decided to set off for the mountain where the man lives<br /><br />For n = 1 (unigram) the sentence is split into:<br /><br />(n = 1) [ [Now] [the] [lord] [once] [decided] [to] [set] [off] [for] [the] [mountain] [where] [the] [man] [lives] ]<br /><br />For n = 2 (bigram) the sentence is split into:<br /><br />(n = 2) [ [Now the] [the lord] [lord once] [once decided] [decided to] [to set] [set off] [off for] [for the] [the mountain] [mountain where] [where the] [the man] [man lives] ]<br /><br />For n = 3 (trigram) the sentence is split into:<br /><br />(n = 3) [ [Now the lord] [the lord once] [lord once decided] [once decided to] [decided to set] [to set off] [set off for] [off for the] [for the mountain] [the mountain where] [mountain where the] [where the man] [the man lives] ]<br /><br />Okay, you get the idea. The sequence of words "w<sub>1</sub>, ..., w<sub>k</sub>" is split into "w<sub>k</sub> and w<sub>k-1</sub>" for bigrams, "w<sub>k</sub> and w<sub>k-1</sub>, w<sub>k-2</sub>" for trigrams and "w<sub>k</sub> and w<sub>k-n+1</sub>, ... 
w<sub>k-1</sub>" in general.<br /><br />Here's the relevant Python code for making n-grams:<br /><code><br />def makeNGrams(inpStr, n):<br />    token = inpStr.split()<br />    nGram = []<br />    for i in range(len(token)):<br />        if i+n > len(token):<br />            break<br />        nGram.append(token[i:n+i])<br />    return nGram<br /></code><br /><br />Or a bit more condense:<br /><code><br />def makeNGrams(inpStr, n):<br />    inpStr = inpStr.split()<br />    return [inpStr[i:n+i] for i in range(len(inpStr)) if len(inpStr)>=i+n]<br /></code><br /><br />Why do you need this?<br /><br />1. Machine Learning uses n-gram models to learn and induce rules from strings.<br />2. Probabilistic models use n-grams for spell checking and correcting misspelled words.<br />3. Compression of data.<br />4. Optical character recognition (OCR), Machine Translation (MT) and Intelligent Character Recognition (ICR) use n-grams to compute the probability of a word sequence or generally a pattern sequence.<br />5. Identify the language of a text (<a href="http://www.xrce.xerox.com/competencies/content-analysis/tools/guesser-ISO-8859-1.en.html">demo here</a>)<br />6. Identify the species given a DNA sample.<br /><br />For example you can compute the probability of a sequence by multipling all previous probabilities: P(w<sub>k</sub>|w<sub>1</sub>, ..., w<sub>k-1</sub>) but if one of these previous sequences is zero, the whole expression will be zero too. This is a huge problem, since these long sequences are hardly ever seen in corpora, even if you take the internet, e.g. "The world, as we know it, will be changed by the pollution of the environment". Therefore we only take the direct predecessor by using an n-gram model and can estimate the probability. Another application for n-grams can be found in Part of Speech tagging and probabilistic disambiguation of tags, e.g. 
the probability of "book/NN the/DT flight/NN" versus the probability "book/VB the/DT flight/NN".<br /><br />I wrote a very simple program to predict the next word given a sequence of words in a corpus, e.g. input: "I will eat"; output: "fish" you can find it <a href="http://rapidshare.com/files/170750014/blatt6.py">here</a>.<br /><br />Another program concering n-grams, which I wrote, is available <a href="http://rapidshare.com/files/170750065/blatt7.py">here</a>. It extracts proper nouns, e.g. "New York City" from English texts.Laughing Manhttp://www.blogger.com/profile/17475724926007017604noreply@blogger.com0tag:blogger.com,1999:blog-281239845003259417.post-11323503333929044872008-11-14T16:30:00.003+01:002008-11-14T16:50:09.277+01:00Art of the StateCurrently I'm very busy in learning the formal foundations of linguistics (Set Theory, Relations), programming (Python) and computational linguistics (Maximum Likelihood Estimation, Context Free Grammars).<br /><br />I got an e-mail. Really. From a reader. By the way, it's my first; that's why I'm quite enthusiastic. He suggested to introduce my readers (do I have any?) to his blog <a href="http://neuropolitics.org/">Neuropolitics</a>. Well, well, I read the first sentences and decided to mention it in this post. Form your own opinion.<br /><br />Perhaps I'm going to write about the foundations of Natural Language Processing or upload some Python code which could be used to play with strings. Mostly harmless, not really meaningful code. I don't think that I'll have time for something else anyway.Laughing Manhttp://www.blogger.com/profile/17475724926007017604noreply@blogger.com0tag:blogger.com,1999:blog-281239845003259417.post-61300278151788932008-10-11T14:01:00.001+02:002008-10-11T14:03:48.437+02:00Time's running, running outUnfortunately, I don't have time to blog for the time being. I study natural language processing, which is very intense. 
So give me a week or two and I'll come back.Laughing Manhttp://www.blogger.com/profile/17475724926007017604noreply@blogger.com0tag:blogger.com,1999:blog-281239845003259417.post-17812514587233338002008-10-05T11:47:00.000+02:002008-10-05T11:47:08.204+02:00Points of Interest 05. October, 20081. Please dear god make this an imperative:<br /><br /><div style="text-align: center;"><a href="http://xkcd.com/481"><img style="width: 150px; height: 150px;" src="http://imgs.xkcd.com/comics/listen_to_yourself.png" /></a><br /></div><br />2. Why don't apes use language although they could? Because they don't have a psychological infrastructure of shared intentionality. Bolles' Review of Tomasello's Power Point Prose: <a href="http://ebbolles.typepad.com/babels_dawn/2008/09/speakers-listen.html">Part 1</a> & <a href="http://ebbolles.typepad.com/babels_dawn/2008/09/how-fascinating.html">Part 2</a><br /><br />3. I always wondered which type of camouflage the US Army uses, since it looks like bad pixels of ancient computer days. Here's why they use digital camouflage: <a href="http://tierneylab.blogs.nytimes.com/2008/09/22/can-you-see-me-now/">Can You See Me Now?</a><br /><br />4. Dying of capsaicin? Well, I eat Habaneros every day - no joke: <a href="http://bayblab.blogspot.com/2008/09/which-organisms-can-feel-pain.html">Which organisms can feel pain?</a> & <a href="http://bayblab.blogspot.com/2008/10/chili-capsaicin-and-cancer.html">Chili, capsaicin and cancer</a><br /><br />5. Time for a linguistic lolcat:<br /><br /><div style="text-align: center;"><a href="http://web.mac.com/arnold_zwicky/tenwords.png"><img style="width: 150px; height: 150px;" src="http://web.mac.com/arnold_zwicky/tenwords.png" /></a><br /></div><br />6. After several years of detective work, philologists at the University of Stavanger in Norway have assembled a unique collection of texts online. 
Now they're about to start the most comprehensive analysis of Middle English ever: <a href="http://www.alphagalileo.org/index.cfm?_rss=1&fuseaction=readrelease&releaseid=532383">New life for Middle English: Norwegian detective work gives new knowledge of the English language. </a><br /><br />7. Syntactic persistence is the tendency for speakers to produce sentences using similar grammatical patterns and rules of language as those they have used before. Although the way this occurs is not well understood, previous research has indicated that this effect may involve a specific aspect of memory function. Memory is made up of two components: declarative and procedural. Declarative memory is used in remembering events and facts. Procedural memory helps us to remember how to perform tasks, such as playing the piano or riding a bike. A recent study suggests that the common phrase "it's so easy, it's like riding a bike" should perhaps be replaced with "it's so easy, it's like forming a sentence": <a href="http://www.eurekalert.org/pub_releases/2008-09/afps-ura092308.php">Un-total recall: Amnesics remember grammar, but not meaning of new sentences</a><br /><br />9. Cool new robots from Japan with cool abilities: <a href="http://www.pinktentacle.com/2008/10/photos-robots-at-ceatec-2008/">Photos: Robots at CEATEC 2008</a><br /><br />10. It's the thalamus that actually matters for sentence processing: <a href="http://talkingbrains.blogspot.com/2008/09/thalamus-yes-basal-ganglia-nope.html">Thalamus? Yes. Basal ganglia? Nope.</a><br /><br />11. Beautiful statue: <a href="http://morbidanatomy.blogspot.com/2008/09/transi-de-ren-de-chalon-ligier-richier.html">"Transi de René de Chalon," Ligier Richier, 1547</a><br /><br />12. Broca's area shows a "sentence complexity" effect. 
It responds more during the comprehension of object relative (OR) constructions than during easier-to-process subject relative (SR) constructions: <a href="http://talkingbrains.blogspot.com/2008/09/brocas-area-sentence-comprehension-and.html">Broca's area, sentence comprehension, and working memory</a><br /><br />13. <a href="http://scienceblogs.com/notrocketscience/2008/09/carbon_nanotechnology_in_an_17th_century_damascus_sword.php">Carbon nanotechnology in a 17th century Damascus sword</a><br /><br />14. A Bob Dylan song encoded in XML: <a href="http://languagelog.ldc.upenn.edu/nll/?p=659">Encoding Dylan</a><br /><br />15. Why choose the lesser evil?<br /><br /><div style="text-align: center;"><a href="http://web.mac.com/arnold_zwicky/IrregularDebate.jpg"><img style="width: 150px; height: 150px;" src="http://web.mac.com/arnold_zwicky/IrregularDebate.jpg" /></a><br /></div><br />16. Interesting, really: <a href="http://bayblab.blogspot.com/2008/10/how-to-beat-off-cold.html">How to beat off a cold</a><br /><br />17. <a href="http://scienceblogs.com/notrocketscience/2008/10/taking_the_new_out_of_neurons.php">Taking the new out of neurons</a><br /><br />18. <a href="http://scienceblogs.com/notrocketscience/2008/09/robostarfish_learns_about_itself_and_adapts_to_injuries.php">Robo-starfish learns about itself and adapts to injuries</a><br /><br />19. <a href="http://bayblab.blogspot.com/2008/10/2008-ignobels.html">2008 IgNobels</a>Laughing Manhttp://www.blogger.com/profile/17475724926007017604noreply@blogger.com0tag:blogger.com,1999:blog-281239845003259417.post-29866568695826229002008-09-22T21:16:00.000+02:002008-09-22T21:16:16.659+02:00Points of Interest 22. September, 20081. Listeners can only keep up with the rapid rate of speech (5 syllables/second) because they anticipate the upcoming syllables of the word. 
A new study conducted by scientists of the University of Rochester and Georgia Tech showed that this holds not only for the phonology but also for the semantics of words: <a href="http://www.eurekalert.org/pub_releases/2008-09/uor-swa091108.php">Scientists watch as listener's brain predicts speaker's words</a> & <a href="http://www.pnas.org/content/105/35/13111.abstract">Neural correlates of partial lexical activation</a><br /><br />2. At age 3–4, the overwhelming majority of children behave selfishly, whereas most children at age 7–8 prefer resource allocations that remove advantageous or disadvantageous inequality: <a href="http://www.nature.com/nature/journal/v454/n7208/abs/nature07155.html">Egalitarianism in young children</a><br /><br />3. The evolution of speech: a speech-recognition area has been found in macaques: <a href="http://www.sciam.com/article.cfm?id=monkey-brains-hint-at-evolutionary-root">Monkey Brains Hint at Evolutionary Root of Language Processing </a><br /><br />4. The world's largest semantic map revealed. First steps toward the Semantic Web? <a href="http://www.physorg.com/news140929129.html">Computers figuring out what words mean</a><br /><br />5. The right word is in our jaw: <a href="http://sciencenow.sciencemag.org/cgi/content/full/2008/915/2">Speaking Without Sound</a> & <a href="http://www.physorg.com/news140613973.html">Breakthrough in understanding of speech offers hope to the deaf</a><br /><br />6. Stuttering causes bilingualism: <a href="http://languagelog.ldc.upenn.edu/nll/?p=603">Does bilingualism cause stuttering?</a><br /><br />7. Neuroaesthetics? <a href="http://scienceblogs.com/neurophilosophy/2008/09/beauty_the_brain.php">Beauty & the Brain</a> and <a href="http://www.seedmagazine.com/news/2008/09/beauty_and_the_brain.php">Beauty and the Brain</a><br /><br />8. Save humanity. 
But first I want more funds for computational linguistics: <a href="http://www.acceleratingfuture.com/michael/blog/2008/09/funding-the-mitigation-of-extinction-risks/">Funding the Mitigation of Extinction Risks</a> and <a href="http://thebulletin.org/web-edition/features/how-can-we-reduce-the-risk-of-human-extinction">How can we reduce the risk of human extinction?</a><br /><br />9. Humans - the best species there is and ever was on earth? Stop kidding me, Lystrosaurus dominated more: <a href="http://www.acceleratingfuture.com/michael/blog/2008/09/technologies-to-watch-out-for-self-copying/">Technologies to Watch Out For: Self-Copying</a><br /><br />10. The geometric bucket, a systematic view: <a href="http://scienceblogs.com/cognitivedaily/2008/09/a_simple_toy_and_what_it_says.php">A simple toy, and what it says about how we learn to mentally rotate objects</a><br /><br />11. Oh my arse: <a href="http://bayblab.blogspot.com/2008/09/evolution-of-assholes.html">The Evolution of Assholes</a><br /><br />12. The seven gates to humanity: <a href="http://ebbolles.typepad.com/babels_dawn/2008/09/seven-gateways.html">What I've Learned About Human Origins</a><br /><br />13. I like the picture of possible paths for human evolution: <a href="http://anthropology.net/2008/09/19/mark-stonekings-four-models-of-human-origins/">Mark Stoneking’s Four Models Of Human Origins</a><br /><br />14. About rhymes in Japanese Hip Hop and what they reveal about the language: <a href="http://no-sword.jp/blog/2008/09/experiment_like_a_scientist.html">I'll experiment like a scientist/ You wanna rhyme, you gotta sign my list</a><br /><br />15. <a href="http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0003083">“Thinking about Not-Thinking”: Neural Correlates of Conceptual Processing during Zen Meditation</a><br /><br />16. 
<a href="http://www.researchchannel.org/prog/displayevent.aspx?rID=16247&fID=4139#">Suicidal Individuals: Evaluation, Therapies, and Ethics – Part 1</a> & <a href="http://www.researchchannel.org/prog/displayevent.aspx?rID=16248&fID=4139">Part 2</a>Laughing Manhttp://www.blogger.com/profile/17475724926007017604noreply@blogger.com2tag:blogger.com,1999:blog-281239845003259417.post-74893645399904014692008-09-17T09:43:00.014+02:002008-09-17T12:25:18.747+02:00Points of Interest 17. September, 2008<div style="text-align: justify;">I got a bit picky about the Points of Interest I choose these days; so less is more. A problem which occured while writing this and which is bothering me: What's the difference between:<br /><br />a) It is not<br />b) It isn't<br />c) It's not<br /><br />I think the first is the most emphasised because there is no contraction at all. The second emphasises the subject - due to the contraction of "is not" the stress shifts to "It". The third emphasises the negation because the stress lies on "not". Language Hat had <a href="http://www.languagehat.com/archives/001969.php">a post</a> about this in 2005.<br /><br />1. [...] findings suggest that New Caledonian crows can solve complex physical problems by reasoning both causally and analogically about causal relations: <a href="http://henry.simon.net.nz/stories/2008/09/17/do-new-caledonian-crows-solve-physical-problems-through-causal-reasoning/">Do New Caledonian crows solve physical problems through causal reasoning?</a><br /><br />Alex Taylor explains the experiment:<br /><br /><div style="text-align: center;"><object width="300" height="300"><param name="movie" value="http://www.youtube.com/v/M52ZVtmPE9g&hl=de&fs=1&rel=0"><param name="allowFullScreen" value="true"><embed src="http://www.youtube.com/v/M52ZVtmPE9g&hl=de&fs=1&rel=0" type="application/x-shockwave-flash" allowfullscreen="true" width="350" height="350"></embed></object></div><br /><br />2. Pro Transhumanism. 
It's not a matter of philosophy - it's a matter of time: <a href="http://www.acceleratingfuture.com/michael/blog/2008/09/transhumanism-as-universal/">Transhumanism as Universal</a><br /><br />3. About the temperature of exclusion metaphors: <a href="http://scienceblogs.com/notrocketscience/2008/09/social_exclusion_literally_feels_cold.php">Social exclusion literally feels cold</a><br /><br />4. Pulvermuller's vs. Wernicke-Lichtheim's functional anatomy of language: <a href="http://talkingbrains.blogspot.com/2008/09/pulvermuller-wernicke-lichtheim.html">Pulvermuller = Wernicke-Lichtheim</a></div>Laughing Manhttp://www.blogger.com/profile/17475724926007017604noreply@blogger.com0tag:blogger.com,1999:blog-281239845003259417.post-61328664506035254822008-09-15T13:51:00.001+02:002008-09-15T13:51:53.427+02:00Points of Interest 15. September, 20081. An interesting article about <a href="http://scienceblogs.com/neurophilosophy/2008/09/neurobiology_of_a_hallucination.php">the neurobiology of a hallucination</a> based on Ffytche, D. (2008). The hodology of hallucinations. Cortex 44: 1067-1083. DOI: 10.1016/j.cortex.2008.04.005:<br /><br /><blockquote>"In the EEG experiments, the activity recorded from two of the electrodes was found to become sychronous whilst the subjects were hallucinating. [...] Ffytche hypothesizes that the changes in connectivity could be due to changes in the firing mode of the thalamo-cortical connections [...] Overall, Ffytche's findings suggest that hallucination cannot be explained by a topological or hodological explanation alone, but instead by a combination of the two. [...]"</blockquote><br /><br />2. Gestalt meets linguistic relativism: <a href="http://ebbolles.typepad.com/babels_dawn/2008/09/what-ive-learne.html">What Bolles has learned about language</a>.<br /><br />3. "[...] 
some so-far anonymous computational linguist caused United Airlines to lose more than a billion dollars of its market capitalization, over the course of about 12 minutes last Monday: <a href="http://languagelog.ldc.upenn.edu/nll/?p=595">Economic linguistics</a><br /><br />4. <a href="http://scienceblogs.com/gnxp/2008/09/who_carried_out_911_views_diff.php">Who carried out 9/11? Views Differ...</a><br /><br />5. From E-Paper to the Semantic Web. What kind of technologies could we expect in 2018? Nature asks: <a href="http://www.nature.com/news/2008/080903/full/455008a.html">What will happen in the next 10 years?</a>Laughing Manhttp://www.blogger.com/profile/17475724926007017604noreply@blogger.com0tag:blogger.com,1999:blog-281239845003259417.post-963837944852510412008-09-04T09:27:00.004+02:002008-09-04T10:17:15.647+02:00One word != one numberEarlier this year a study was conducted by researchers from the University of Melbourne and University College London - namely <a href="http://www.icn.ucl.ac.uk/Staff-Lists/MemberDetails.php?Title=Prof&FirstName=Brian&LastName=Butterworth">Brian Butterworth</a>, <a href="http://www.psych.unimelb.edu.au/people/staff/ReeveR.html">Robert Reeve</a>, Fiona Reynolds and Delyth Lloyd. Children of two indigenous communities were tested for their numeracy skills; one from the Tanami Desert and the other from Groote Eylandt. Another group consisted of indigenous preschool children from Melbourne. Here's a map of the locations:<br /><br /><div style="text-align: center;"><a href="http://img262.imageshack.us/my.php?image=ausoutll8.png" target="_blank"><img src="http://img262.imageshack.us/img262/6899/ausoutll8.th.png" alt="Free Image Hosting at www.ImageShack.us" border="0" /></a><span style="text-decoration: underline;"><br /></span></div><br />The results showed clearly that the children of indigenous communities - who have no words or even gestures for numbers - have numeracy skills equal to native English-speaking children. 
So numeracy is not based on culture or language but is probably an innate faculty.<br /><br /><br />Publications:<br /><ul class="publicationsul"><li id="oncite161271"><span class="pubauth">Butterworth, B., Reeve, R.</span> <span class="pubyear">(Forthcoming)</span>. <span class="pubtitle">Verbal counting and spatial strategies in numerical tasks: Evidence from indigenous Australia.</span> <i>Philosophical Psychology</i> </li><li id="oncite161277"><span class="pubauth">Butterworth, B., Reeve, R., Reynolds, F., Lloyd, D.</span> <span class="pubyear">(Forthcoming)</span>. <span class="pubtitle">Numerical thought with and without words: Evidence from indigenous Australian children.</span> <i>Proceedings of National Academy of Sciences of the USA</i> </li></ul>Laughing Manhttp://www.blogger.com/profile/17475724926007017604noreply@blogger.com1