Tags: indexing, splunk, splunk-query

New CSV file not syncing with Splunk index


I'm facing a problem with Splunk: I have an index whose data input is a folder of CSV files. When I add another CSV file to that folder, data from the new source does not show up in the index. I have restarted Splunk many times and even deleted and recreated the index, but the problem is still there.

I haven't added any configuration for that folder.

Do I need to add any configuration for that folder? If yes, please help me; I'm new to Splunk.

One more thing: if I check the file count under Settings > Data inputs for the folder, it shows the correct number, but when I search against the mapped index, fewer files show up than expected.
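For example, when I list the indexed files with a search like the one below (my_index just stands in for my actual index name), it returns fewer sources than the file count shown in Data inputs:

index=my_index | stats count by source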

The default inputs.conf file is:

[default]
index = default
_rcvbuf = 1572864
host = $decideOnStartup

[blacklist:$SPLUNK_HOME/etc/auth]

[blacklist:$SPLUNK_HOME/etc/passwd]

[monitor://$SPLUNK_HOME/var/log/splunk]
index = _internal

[monitor://$SPLUNK_HOME/var/log/watchdog/watchdog.log*]
index = _internal

[monitor://$SPLUNK_HOME/var/log/splunk/license_usage_summary.log]
index = _telemetry

[monitor://$SPLUNK_HOME/var/log/splunk/splunk_instrumentation_cloud.log*]
index = _telemetry
sourcetype = splunk_cloud_telemetry

[monitor://$SPLUNK_HOME/etc/splunk.version]
_TCP_ROUTING = *
index = _internal
sourcetype=splunk_version

[batch://$SPLUNK_HOME/var/run/splunk/search_telemetry/*search_telemetry.json]
move_policy = sinkhole
index = _introspection
sourcetype = search_telemetry
crcSalt = <SOURCE>
log_on_completion = 0

[batch://$SPLUNK_HOME/var/spool/splunk]
move_policy = sinkhole
crcSalt = <SOURCE>

[batch://$SPLUNK_HOME/var/spool/splunk/...stash_new]
queue = stashparsing
sourcetype = stash_new
move_policy = sinkhole
crcSalt = <SOURCE>

[fschange:$SPLUNK_HOME/etc]
#poll every 10 minutes
pollPeriod = 600
#generate audit events into the audit index, instead of fschange events
signedaudit=true
recurse=true
followLinks=false
hashMaxSize=-1
fullEvent=false
sendEventMaxSize=-1
filesPerDelay = 10
delayInMills = 100

[udp]
connection_host=ip

[tcp]
acceptFrom=*
connection_host=dns

[splunktcp]
route=has_key:_replicationBucketUUID:replicationQueue;has_key:_dstrx:typingQueue;has_key:_linebreaker:indexQueue;absent_key:_linebreaker:pars$
acceptFrom=*
connection_host=ip

[script]
interval = 60.0
start_by_shell = true

[SSL]
# SSL settings
# The following provides modern TLS configuration that guarantees forward-
# secrecy and efficiency. This configuration drops support for old Splunk
# versions (Splunk 5.x and earlier).
# To add support for Splunk 5.x set sslVersions to tls and add this to the
# end of cipherSuite:
# DHE-RSA-AES256-SHA:AES256-SHA:DHE-RSA-AES128-SHA:AES128-SHA
# and this, in case Diffie Hellman is not configured:
# AES256-SHA:AES128-SHA

sslVersions = tls1.2
cipherSuite = ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA$
ecdhCurves = prime256v1, secp384r1, secp521r1

allowSslRenegotiation = true
sslQuietShutdown = false

Solution

  • You need to create a new monitor stanza that points to where your CSV file is located. For example,

    [monitor:///home/user/data/myfile.csv]
    index = csv_data
    

    You can add the input through the web GUI in some cases (for example, on a single Splunk instance), or you may need to edit the configuration files directly.

    Refer to https://docs.splunk.com/Documentation/Splunk/8.0.5/Data/Monitorfilesanddirectorieswithinputs.conf and https://docs.splunk.com/Documentation/Splunk/8.0.5/Admin/Inputsconf

    It is best practice not to modify the default inputs.conf file. Instead, you should create a file at /opt/splunk/etc/system/local/inputs.conf and include the monitor stanza there.
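
    For instance, a minimal local inputs.conf for a whole folder of CSV files might look like the sketch below; the path, index name, and sourcetype are placeholders you would replace with your own values:

    # hypothetical folder, index, and sourcetype -- replace with your own
    [monitor:///home/user/data]
    disabled = false
    index = csv_data
    sourcetype = csv
    whitelist = \.csv$

    After saving the file, restart Splunk (you can confirm the stanza is picked up with $SPLUNK_HOME/bin/splunk btool inputs list --debug), and new CSV files dropped into the folder should then be indexed automatically.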