Looking for someone with Chinese knowledge


We’re looking to implement CJK support in Sphinx, the open source full-text search engine.
Initially we’re thinking of basing the search on bi-gram indexing to keep things simple, especially since research papers suggest it offers decent quality in most cases. This is not that complex to implement, but there is no way we can test it ourselves, as we have zero knowledge of Chinese or Japanese.
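For the curious, bi-gram indexing sidesteps word segmentation entirely: every pair of adjacent CJK characters is indexed as a “word”, so a query segmented the same way will match without any dictionary. A minimal sketch in Python (the function name and example string are ours for illustration, not Sphinx code):

```python
def bigrams(text):
    """Split a run of CJK characters into overlapping character bigrams.

    "全文搜索" -> ["全文", "文搜", "搜索"]
    A single character is returned as-is so one-character queries still match.
    """
    chars = [c for c in text if not c.isspace()]
    if len(chars) < 2:
        return ["".join(chars)] if chars else []
    return [chars[i] + chars[i + 1] for i in range(len(chars) - 1)]

print(bigrams("全文搜索"))  # ['全文', '文搜', '搜索']
```

Because both the document and the query are broken up the same deterministic way, recall is high; the known trade-off is false matches across word boundaries, which is why proximity-based relevance ranking usually accompanies this scheme.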

If you know Chinese, Japanese, or Korean and would like to help us test Sphinx support for these languages, let us know. No special development skills are required; if you’re reading this blog, you should be technical enough.


Comments (25)

  • islue

    I’m a native Chinese speaker and know a little Japanese. I’d like to help.

    January 11, 2007 at 6:10 pm
  • YoungWoo Kim

    Hi, I’m Kim and I’m Korean.
    I’m living in Seoul, Korea, and currently working for ‘Daum Communications’ as a DBA (Oracle, MySQL).
    I wanna test Sphinx CJK support.

    -YW Kim

    January 11, 2007 at 6:51 pm
  • jedy

    I’m Chinese and also have some Japanese knowledge. I’d like to help test.

    January 11, 2007 at 7:00 pm
  • Hao

    Hi there, I’d like to help with your testing of Chinese. Write to fu2009@gmail.com if I can join 😛

    January 11, 2007 at 8:00 pm
  • Sun

    I am from China, and I would like to join this test.
    Is that OK?

    January 11, 2007 at 8:19 pm
  • Nick Zhao

    Hi Peter, I’m a Chinese guy living in Dalian, China. I’m a big fan of LAMP, though I have only a little
    knowledge of it. But if you just want someone who knows Chinese much better than you and also
    wants to help, please feel free to contact me via email or MSN.

    P.S. Please be prepared to bear with my poor English, and I should let you know that I only began learning LAMP
    a couple of days ago. 🙂

    Best wishes.


    January 11, 2007 at 10:08 pm
  • liang

    I am Chinese. I have 3 years of C++ programming experience. I like open source projects. Please contact me; I’d like to test Sphinx.

    January 11, 2007 at 10:42 pm
  • Dale

    Peter, just as an FYI, I’ve actually implemented this in Sphinx for edgeio.com. You can see it in action at:


    However, I don’t think we’re contributing the code back to Sphinx. We used bigrams along with proximity relevance scoring. Based on what I’ve seen, the relevance ranking is pretty good. So far we’re just doing Chinese UTF-8. We have some folks in China who have done some testing with it.

    My knowledge of Chinese was just good enough to get by here, but I’d be interested in seeing how your effort goes, and helping out a bit if I can.

    January 12, 2007 at 1:45 am
  • mshk

    Hi, I’m a Japanese web programmer and interested in testing Sphinx.
    How can I help you?

    January 12, 2007 at 2:10 am
  • hongqn

    Peter, I’m a Chinese programmer and I’d like to help. I have good Python/C skills and enough knowledge of CJK character encoding, just FYI.

    January 12, 2007 at 2:57 am
  • peter

    Thank you guys,

    I did not expect so many people to respond so quickly. We’ll now look into how best to organize this, contact those who provided emails, and post some information here.

    January 12, 2007 at 3:06 am
  • Philip Tellis

    You should collaborate with the Namazu developers (http://www.namazu.org/index.html.en). Namazu is a search engine made primarily for CJK languages, but it also works with English. The engine is written in C, and the indexer is written in Perl. I’ve found their code fairly easy to read and follow (and I do not know any CJK language), and I have submitted a few patches in the past. The developers are quite helpful.

    January 13, 2007 at 8:39 am
  • peter

    5 Sun:

    Your email does not seem to be working. Please contact us if you see this.

    January 14, 2007 at 8:22 am
  • frank

    I am Chinese. I like your products and I always use them. I want to help you.

    January 14, 2007 at 8:56 am
  • Gu Lei

    Hi Peter,

    I’m Chinese. I also want to join that test. Contact me if needed.

    January 14, 2007 at 6:59 pm
  • Bill

    I am interested in this testing. Chinese and Japanese are both OK for me.

    January 14, 2007 at 8:40 pm
  • Eric

    I can test Chinese using OS X. eric18 @ gmail . com

    January 15, 2007 at 9:24 pm
  • yejr

    Hi Peter, I’m the owner of http://imysql.cn. I’m Chinese, I’m a DBA skilled in MySQL optimization, and I would like to join you 🙂

    January 16, 2007 at 6:29 am
  • Louis

    Hi, I’m Chinese. I hope to join the testing. Please contact me: liukaixuan@gmail.com

    January 16, 2007 at 8:13 pm
  • Josh

    Hi Peter, I am a Chinese web programmer with 3 years of PHP experience. If you want to test Sphinx CJK support on Debian AMD64, please contact me.

    epaulin AT gmail dot com

    January 16, 2007 at 10:03 pm
  • xLight

    I am a PHP/MySQL web application programmer.
    I have also been a motherboard tester.

    January 16, 2007 at 11:08 pm
  • Lisa Lan

    I am interested in this testing. I am an Oracle and MySQL DBA, and I’m Chinese.

    January 17, 2007 at 6:41 pm
  • anakin

    I am Chinese, with 3 years of LAMP experience, and I’m interested in search technology. Please contact me if you need me.
    anakinsun AT gmail.com

    November 16, 2007 at 12:11 am
  • KayL

    I’m Chinese, no special skills.
    Feel free to contact me if you need.

    October 17, 2008 at 8:13 am
  • Galen

    How is the progress with this Sphinx Chinese language search test?

    The quality of Chinese language search also depends heavily on the quality of word segmentation.

    I am wondering if we could use just unigrams when indexing (at the cost of a bigger index) and do word segmentation on the user-submitted search query (or ask users to segment their query themselves, which makes sense as they know what they want to search for), and then have Sphinx search using, say, maximum-length match, relevance sorting, and so on.

    Does it make sense to do it this way if we cannot beat Google/Baidu at word segmentation?

    January 17, 2009 at 4:47 pm
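
Galen’s “maximum length match” suggestion above can be illustrated with greedy forward maximum matching against a word dictionary. A toy sketch, with all names and the dictionary invented for illustration (this is not Sphinx code):

```python
# Greedy forward maximum matching: at each position, take the longest
# dictionary word; fall back to a single character when nothing matches.
DICT = {"全文", "搜索", "引擎", "全文搜索"}  # toy dictionary
MAX_WORD_LEN = 4  # longest word we ever look for

def segment(text):
    out, i = [], 0
    while i < len(text):
        # try the longest candidate first, down to a single character
        for n in range(min(MAX_WORD_LEN, len(text) - i), 0, -1):
            word = text[i:i + n]
            if n == 1 or word in DICT:
                out.append(word)
                i += n
                break
    return out

print(segment("全文搜索引擎"))  # ['全文搜索', '引擎']
```

This only has to run on short user queries, which is why indexing unigrams and segmenting at query time, as Galen proposes, keeps the expensive dictionary work off the indexing path at the cost of a larger index.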

Leave a Reply