Error/Warning

These notes are all based on my own experience. The steps may not be the best methods, but they might save you long hours of stress. Feel free to drop a comment, question, suggestion or correction.

  • !  Deny listing of directories (CentOS/Apache)

    To disable directory listings, I needed to modify /etc/httpd/conf.d/autoindex.conf, /etc/httpd/conf.d/userdir.conf, and /etc/httpd/conf/httpd.conf.

    Find the lines where Indexes appears as part of the directory Options, for example

    Options MultiViews Indexes SymLinksIfOwnerMatch IncludesNoExec

    then delete the word Indexes from each Options line.
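As a rough sketch, the edit above can be automated; this assumes the config files are plain text, and the function name is my own:

```python
import re

def strip_indexes(config_text: str) -> str:
    """Remove the Indexes token from every Options directive."""
    out_lines = []
    for line in config_text.splitlines():
        if line.lstrip().startswith("Options"):
            # drop the standalone word "Indexes" and collapse the spacing
            line = re.sub(r"\bIndexes\b", "", line)
            line = re.sub(r"[ \t]{2,}", " ", line).rstrip()
        out_lines.append(line)
    return "\n".join(out_lines)

conf = "Options MultiViews Indexes SymLinksIfOwnerMatch IncludesNoExec"
print(strip_indexes(conf))
# Options MultiViews SymLinksIfOwnerMatch IncludesNoExec
```

Back up the config files and run `apachectl configtest` before restarting httpd.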

  • ⨷  [Theano] An update must have the same type as the original shared variable

    Run the script with the floatX flag set to float32:

    THEANO_FLAGS='floatX=float32' python xxx.py
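To my knowledge, the same setting can be made permanent in Theano's config file, ~/.theanorc, instead of passing the environment variable on every run:

```ini
[global]
floatX = float32
```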

  • ⨷  'numpy.float64' object cannot be interpreted as an index

    Caused by a deprecation in NumPy 1.12.x. Either install NumPy 1.11.x or add .astype(np.int) to the expression causing the error.

    pip install -U numpy==1.11.0
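The same failure mode exists with plain Python floats, and casting the index to an integer is the general fix; a minimal sketch, not NumPy-specific:

```python
values = [10, 20, 30, 40]

i = 2.0  # e.g. the result of a float computation such as len(values) / 2

# values[i] raises "list indices must be integers"; cast first
print(values[int(i)])  # 30
```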
  • ⨷  (Python) _csv.Error: field larger than field limit (131072)

    Happens when reading CSV files that contain very large cell values. Following a Stack Overflow answer, the fix is to raise csv.field_size_limit as close to sys.maxsize as the platform allows:

    import csv
    import sys

    maxInt = sys.maxsize
    decrement = True

    while decrement:
        # decrease the maxInt value by factor 10
        # as long as the OverflowError occurs.

        decrement = False
        try:
            csv.field_size_limit(maxInt)
        except OverflowError:
            maxInt = int(maxInt / 10)
            decrement = True
    
    

    This loop backs off until it finds the largest limit the platform accepts, which avoids the overflow error.
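A minimal end-to-end sketch of the fix, using a temporary file (the file name and cell size are illustrative):

```python
import csv
import sys
import tempfile

# Raise the field size limit before parsing, backing off on OverflowError.
max_int = sys.maxsize
while True:
    try:
        csv.field_size_limit(max_int)
        break
    except OverflowError:
        max_int //= 10

# Write a CSV whose single cell exceeds the default 131072-byte limit ...
big_cell = "x" * 200_000
with tempfile.NamedTemporaryFile("w", suffix=".csv",
                                 delete=False, newline="") as f:
    csv.writer(f).writerow([big_cell])
    path = f.name

# ... and read it back without hitting _csv.Error.
with open(path, newline="") as f:
    row = next(csv.reader(f))

print(len(row[0]))  # 200000
```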

     

  • ⨷  (Logstash JVM Heap Error) Error: Your application used more memory than the safety cap of 1G.

    Edit /etc/logstash/jvm.options and add or edit the lines

    -Xms4g
    -Xmx4g

    Set the heap size you want; here I used 4 GB.

  • ⨷  Error elasticsearch fails to start after changing data and log directory

    Set the correct permissions on the new data and log directories, i.e. make elasticsearch the owner of those directories:

    sudo chown -R elasticsearch  /path/to/directory
  • >_  Copy columns from a Postgres table to a tab-delimited CSV file
    copy table_name(column1, column2) to '/path/to/file.csv' delimiter E'\t' csv header;
  • >_  Find a string recursively in a directory (Linux)
    grep -rnw '/path/to/folder/' -e "search string"
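For completeness, here is a rough Python equivalent of the recursive grep; the function name, paths, and search string are illustrative:

```python
import os

def find_string(root: str, needle: str):
    """Yield (path, line_number, line) for every line containing needle."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, errors="ignore") as f:
                    for lineno, line in enumerate(f, start=1):
                        if needle in line:
                            yield path, lineno, line.rstrip("\n")
            except OSError:
                continue  # skip unreadable files

# for path, lineno, line in find_string("/path/to/folder", "search string"):
#     print(f"{path}:{lineno}:{line}")
```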
  • ⨷  Extract line range in file

    For example, to extract lines 20397949 through 20406761: the lines are read from extract_from_file and written to extract_to_file.

    sed -n 20397949,20406761p extract_from_file > extract_to_file
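The same range extraction can be sketched in Python with itertools.islice; line numbers are 1-based and inclusive, as with sed, and the function name and file names are illustrative:

```python
from itertools import islice

def extract_lines(src: str, dst: str, start: int, end: int) -> None:
    """Copy lines start..end (1-based, inclusive) from src to dst."""
    with open(src) as fin, open(dst, "w") as fout:
        # islice skips start-1 lines, then stops after line `end`
        fout.writelines(islice(fin, start - 1, end))

# extract_lines("extract_from_file", "extract_to_file", 20397949, 20406761)
```

Unlike sed, this streams the file line by line without loading it into memory, so it also works on very large files.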
  • >_  Find the line number of a string in a file using grep
    grep -nr search_text File_name