{"error":"MergeMappingException[Merge failed with failures {[mapper [local_ip] of different type, current_type [long], merged_type [string]]}]","status":400}
Browsing around the Elasticsearch site, I found this post:
(http://www.elasticsearch.org/blog/changing-mapping-with-zero-downtime/)
the problem — why you can’t change mappings
You can only find that which is stored in your index. In order to make your data searchable, your database needs to know what type of data each field contains and how it should be indexed. If you switch a field type from e.g. a string to a date, all of the data for that field that you already have indexed becomes useless. One way or another, you need to reindex that field.
...
OK, this paragraph is perfectly reasonable: I changed a field's type, so that field needs to be reindexed. Any database would require the same. Fair enough.
Reading on to the "reindexing your data" part, damn, it's totally lame: their "reindexing your data" doesn't mean reindexing just the modified field, it means creating a brand-new index and reindexing every single field. Outrageous.
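Concretely, that means the new index has to exist with the corrected mapping before any documents are copied into it. A minimal sketch with pyes, assuming the names used in this post (store_v2, type client, local_ip as string):

import pyes

conn = pyes.ES("http://10.xx.xx.xx:8305/")
conn.create_index('store_v2')
# In the new index, local_ip gets its corrected type from the start.
conn.put_mapping('client', {'properties': {'local_ip': {'type': 'string'}}},
                 ['store_v2'])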
The problem now is that this is a live production service and can't be taken down, so I need a scheme for pouring the data into my new index.
The Elasticsearch site goes on:
pull the documents in from your old index, using a scrolled search and index them into the new index using the bulk API. Many of the client APIs provide a reindex() method which will do all of this for you. Once you are done, you can delete the old index.
The first sentence sounds lovely, but after hunting around, damn, there's not a single concrete example anywhere, not even on Google. How am I supposed to move the data?
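For the record, pieced together from the REST docs of that era, the raw scroll-plus-bulk flow would look roughly like this (a sketch against the 0.90-style scan/scroll and _bulk endpoints, untested; host, index, and type names are the ones used elsewhere in this post):

import json
import urllib2

BASE = "http://10.xx.xx.xx:8305"

# Open a scan-type scrolled search over the old index.
req = urllib2.Request(
    BASE + "/store_v1/client/_search?search_type=scan&scroll=30m&size=1000",
    json.dumps({"query": {"match_all": {}}}))
scroll_id = json.load(urllib2.urlopen(req))["_scroll_id"]

while True:
    # Fetch the next batch; the scroll endpoint takes the bare id as body.
    page = json.load(urllib2.urlopen(
        urllib2.Request(BASE + "/_search/scroll?scroll=30m", scroll_id)))
    hits = page["hits"]["hits"]
    if not hits:
        break
    scroll_id = page["_scroll_id"]

    # Replay the batch into the new index through the bulk API.
    lines = []
    for hit in hits:
        lines.append(json.dumps(
            {"index": {"_index": "store_v2", "_type": "client", "_id": hit["_id"]}}))
        lines.append(json.dumps(hit["_source"]))
    urllib2.urlopen(urllib2.Request(BASE + "/_bulk", "\n".join(lines) + "\n"))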
The second sentence, client APIs: that looks like the only workable route.
Python is what I know best, so I went straight for pyes. After installing a pile of crappy dependencies, it finally runs:
import pyes

# Connect and open a scrolled scan over every document in the old index.
conn = pyes.es.ES("http://10.xx.xx.xx:8305/")
search = pyes.query.MatchAllQuery().search(bulk_read=1000)
hits = conn.search(search, 'store_v1', 'client', scan=True, scroll="30m",
                   model=lambda _, hit: hit)

# Replay each document into the new index, buffered through the bulk API.
for hit in hits:
    # print hit
    conn.index(hit['_source'], 'store_v2', 'client', hit['_id'], bulk=True)

conn.flush()
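Once the loop finishes and the doc counts of store_v1 and store_v2 match, the last step from the quote above is deleting the old index. A minimal wrap-up sketch, assuming the pyes calls of that era (force_bulk and delete_index):

# Push whatever is still sitting in the client-side bulk buffer,
# then drop the old index.
conn.force_bulk()
conn.delete_index('store_v1')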