logstash - Can anyone give a list of REST APIs to query elasticsearch?


  1. I am trying to push logs to Elasticsearch through Logstash.
  2. My logstash.conf has two log files as input, an elasticsearch output, and a grok filter (a trimmed sketch of the whole file is at the end of this question). Here is the grok match:

    grok {   match => [ "message", "(?<timestamp>[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2},[0-9]{3})  (?:\[%{GREEDYDATA:caller_thread}\]) (?:%{LOGLEVEL:level})  (?:%{DATA:caller_class})(?:\-%{GREEDYDATA:message})" ]    }
  3. When Elasticsearch is started, the logs are added to the Elasticsearch server under the separate index names mentioned in logstash.conf.

  4. My doubt is: how are the logs stored in Elasticsearch? I know they are stored under the index names mentioned in logstash.conf.

The 'http://164.99.178.18:9200/_cat/indices?v' API gave me the following:

health status index      pri rep docs.count docs.deleted store.size pri.store.size
yellow open   tomcat-log   5   1       6478            0      1.9mb          1.9mb
yellow open   apache-log   5   1        212            0      137kb          137kb
  5. But how are 'documents' and 'fields' created in Elasticsearch from these logs?

I have read that Elasticsearch is a REST-based search engine. So, are there REST APIs I can use to analyze the data in Elasticsearch?
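
For reference, this is roughly what my logstash.conf looks like; the file paths are placeholders and the exact option names can vary a little between Logstash versions:

    input {
      file {
        path => "/var/log/tomcat/catalina.out"    # placeholder path
        type => "tomcat"
      }
      file {
        path => "/var/log/apache2/access.log"     # placeholder path
        type => "apache"
      }
    }

    filter {
      grok {
        match => [ "message", "(?<timestamp>[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2},[0-9]{3})  (?:\[%{GREEDYDATA:caller_thread}\]) (?:%{LOGLEVEL:level})  (?:%{DATA:caller_class})(?:\-%{GREEDYDATA:message})" ]
      }
    }

    output {
      elasticsearch {
        hosts => ["164.99.178.18:9200"]
        index => "%{type}-log"                    # resolves to tomcat-log / apache-log
      }
    }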

Indeed, there are.

curl localhost:9200/tomcat-log/_search 

will give you the first 10 documents and the total number of docs in the index.
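
Each log line that Logstash pushed becomes one JSON document, and the fields extracted by your grok filter (caller_thread, level, caller_class, and so on) show up under _source. The response looks roughly like this; the values below are made up purely for illustration:

    {
      "took": 3,
      "hits": {
        "total": 6478,
        "hits": [
          {
            "_index": "tomcat-log",
            "_type": "logs",
            "_id": "...",
            "_source": {
              "@timestamp": "2016-03-30T10:15:42.123Z",
              "timestamp": "2016-03-30 10:15:42,123",
              "caller_thread": "http-bio-8080-exec-1",
              "level": "ERROR",
              "caller_class": "com.example.SomeClass",
              "message": "something went wrong"
            }
          }
        ]
      }
    }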

curl localhost:9200/tomcat-log/_search -d '{   "query": {     "match": {       "level" : "error"     }   } }' 

will give you the documents in tomcat-log that have a level equal to error.
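
If you only need the number of matching documents rather than the documents themselves, the _count endpoint accepts the same kind of query body (level here is the field produced by your grok filter, so adjust it if your field names differ):

curl localhost:9200/tomcat-log/_count -d '{   "query": {     "match": {       "level" : "error"     }   } }'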

Have a look at the relevant section of the book as well. Hope this helps.
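
Finally, since you asked about analysing the data: aggregations are the usual way to summarise it over REST. Something along these lines should count the documents per log level (depending on how the level field is mapped, you may need to run it against a not_analyzed or keyword variant of the field):

curl localhost:9200/tomcat-log/_search -d '{   "size": 0,   "aggs": {     "levels": {       "terms": { "field": "level" }     }   } }'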

