
I have set up django-sphinx in my project, and it works perfectly, but only for a while. After that it always returns an empty result set. Surprisingly, restarting the Django app fixes it: search works again, but again only for a short time (or for a very limited number of queries). Here is my sphinx.conf:

source src_questions
{
    # data source
    type        = mysql
    sql_host    = xxxxxx
    sql_user    = xxxxxx #replace with your db username
    sql_pass    = xxxxxx #replace with your db password
    sql_db      = xxxxxx #replace with your db name
    # these two are optional
    sql_port    = xxxxxx
    #sql_sock   = /var/lib/mysql/mysql.sock

    # pre-query, executed before the main fetch query
    sql_query_pre   = SET NAMES utf8

    # main document fetch query
    sql_query       =       SELECT q.id AS id, q.title AS title, q.tagnames AS tags, q.html AS text, q.level AS level \
                            FROM question AS q \
                            WHERE q.deleted=0

    # optional - used by command-line search utility to display document information
    sql_query_info  = SELECT title, id, level FROM question WHERE id=$id

    sql_attr_uint   = level
}

index questions {
    # which document source to index
    source      = src_questions

    # this is path and index file name without extension
    # you may need to change this path or create this folder
    path            = /home/rafal/index/index_questions
    # docinfo (ie. per-document attribute values) storage strategy
    docinfo     = extern

    # morphology
    morphology  = stem_en

    # stopwords file
    #stopwords  = /var/data/sphinx/stopwords.txt

    # minimum word length
    min_word_len    = 3

    # uncomment next 2 lines to allow wildcard (*) searches
    min_infix_len = 1
    enable_star = 1

    # charset encoding type
    charset_type    = utf-8
}

# indexer settings
indexer
{
    # memory limit (default is 32M)
    mem_limit   = 64M
}

# searchd settings
searchd
{
    # IP address on which search daemon will bind and accept
    # optional, default is to listen on all addresses,
    # ie. address = 0.0.0.0
    address     = 127.0.0.1

    # port on which search daemon will listen
    port        = 3312

    # searchd run info is logged here - create or change the folder
    log     = ../log/sphinx.log

    # all the search queries are logged here
    query_log   = ../log/query.log

    # client read timeout, seconds
    read_timeout    = 5

    # maximum amount of children to fork
    max_children    = 30

    # a file which will contain searchd process ID
    pid_file    = searchd.pid

    # maximum amount of matches this daemon would ever retrieve
    # from each index and serve to client
    max_matches = 1000
} 

And here is the search part of my views.py:

content = Question.search.query(keywords)
if level:
    content = content.filter(level=level)  # level is a list of integers
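
For context, django-sphinx is queried through a SphinxSearch manager attached to the model, so the surrounding code presumably looks roughly like the sketch below. The model fields mirror the columns selected in sql_query above and the index name matches sphinx.conf; the field types, the Meta options, and the search_questions wrapper are assumptions for illustration, not the original code.

# models.py -- hypothetical sketch; fields mirror the columns in sql_query above
from django.db import models
from djangosphinx.models import SphinxSearch

class Question(models.Model):
    title    = models.CharField(max_length=255)
    tagnames = models.CharField(max_length=255)
    html     = models.TextField()
    level    = models.IntegerField()
    deleted  = models.BooleanField(default=False)

    objects = models.Manager()
    search  = SphinxSearch(index='questions')  # same name as the index block in sphinx.conf

    class Meta:
        db_table = 'question'                  # table read by sql_query

# views.py -- hypothetical wrapper around the two lines shown above
def search_questions(keywords, level=None):
    content = Question.search.query(keywords)
    if level:
        content = content.filter(level=level)  # attribute filter on the level list
    return list(content)                       # evaluating the queryset sends the query to searchd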

There are no errors in the logs, and no results are returned. I have 'indexer --rotate --all' set up in cron to run every 5 minutes, and searchd is up and running the whole time. Any help would be appreciated.
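
For reference, the cron job described above would look something like the entry below; the paths to the indexer binary, the config file, and the log file are assumptions.

# crontab entry (crontab -e): rebuild all indexes every 5 minutes and signal searchd to rotate to the new files
*/5 * * * * /usr/local/bin/indexer --config /home/rafal/sphinx.conf --rotate --all >> /home/rafal/log/indexer.log 2>&1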


1 Answer


What version of Sphinx? Of django-sphinx? Of the sphinxsearch API? Of Python?

Anyway, try removing the indexer from cron and see whether the problem persists. Let me know how it goes.

Answered 2010-10-29T18:07:27