
Settings_Linux

Useful bash scripts and settings for managing a Linux server

Post-upgrade of fishfish

4/9/2018~4/22/2018

By Adam Lu


  1. Organized files:

    • Based on this discussion, moved all custom bash scripts from /usr/bin/ to /usr/local/bin/:

      mv /usr/bin/*.sh /usr/local/bin/
      
  2. Cron jobs:

    • Needed to reset public keys so that the servers trust each other and rsync can be run without a password.

    • In a regular terminal on fishfish:

      • Generated a public/private rsa key pair for the user adam (I couldn’t use root because I didn’t know the root password on chalkboard):

          ssh-keygen
        
      • When prompted, saved the files at the default location /home/adam/.ssh/id_rsa
      • When prompted, entered nothing for passphrase.
      • Output:

        Generating public/private rsa key pair.
        Enter file in which to save the key (/home/adam/.ssh/id_rsa):
        Enter passphrase (empty for no passphrase):
        Enter same passphrase again:
        Your identification has been saved in /home/adam/.ssh/id_rsa.
        Your public key has been saved in /home/adam/.ssh/id_rsa.pub.
        The key fingerprint is:
        SHA256:QHgmYsho7DiY4kxN7jYJbLmO5vm5PitK5TmExSsBMJc adam@fishfish
        The key's randomart image is:
        +---[RSA 2048]----+
        |O... ..          |
        |+E.o.o           |
        |+=+ +.           |
        |B+o. .           |
        |+++. S           |
        | +==.            |
        |o..+.            |
        |o+..o            |
        |=o+*+            |
        +----[SHA256]-----+

      • Used ssh-copy-id to copy the public key to chalkboard:

          ssh-copy-id -i ~/.ssh/id_rsa.pub 128.143.16.133
        
      • Output:

        Now try logging into the machine, with: "ssh '128.143.16.133'", and check to make sure that only the key(s) you wanted were added.

      • No longer needed a password to ssh into chalkboard
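The interactive session above can also be reproduced non-interactively. A minimal sketch, writing to a scratch directory so the real ~/.ssh is untouched (on the servers the key lives at ~/.ssh/id_rsa and is pushed with ssh-copy-id as above):

```shell
# Generate an RSA key pair with an empty passphrase (-N ""), non-interactively.
# A scratch directory is used for illustration only; the real key goes in ~/.ssh/.
keydir=$(mktemp -d)
ssh-keygen -q -t rsa -b 2048 -N "" -f "$keydir/id_rsa"
ls "$keydir"    # the key pair: id_rsa, id_rsa.pub
```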
    • In a regular terminal on chalkboard:

      • Generated a public/private rsa key pair for the user adam (I couldn’t use root because I didn’t know the root password on fishfish):

          ssh-keygen
        
      • When prompted, saved the files at the default location /home/adam/.ssh/id_rsa
      • When prompted, entered nothing for passphrase.
      • Output:

        Your identification has been saved in /home/adam/.ssh/id_rsa.
        Your public key has been saved in /home/adam/.ssh/id_rsa.pub.
        The key fingerprint is:
        6c:74:2e:eb:2f:82:9c:5d:27:1d:50:18:55:62:5e:cc adam@chalkboard
        The key's randomart image is:
        +--[ RSA 2048]----+
        |    .==+o        |
        |     oo oE       |
        |      ..o        |
        |     o o.        |
        |     S...        |
        |     .ooo        |
        |   . + ..o       |
        |    + o..        |
        |     ..o.        |
        +-----------------+

      • Used ssh-copy-id to copy the public key to fishfish:

          ssh-copy-id -i ~/.ssh/id_rsa.pub 128.143.16.164
        
      • Output:

        Now try logging into the machine, with "ssh '128.143.16.164'", and check in:

        ~/.ssh/authorized_keys

        to make sure we haven’t added extra keys that you weren’t expecting.

      • Still needed a password to ssh to fishfish
    • Tried adding a private key for chalkboard, based on this website:
      • Started an SSH Agent:

          ssh-agent $SHELL
        
      • Loaded the private key to the SSH agent:

        ssh-add
        
      • Still needed a password to ssh to fishfish

    • This thread explains that ssh will not use the public key if the .ssh directory is writable by users other than the owner

      • Changed the permissions for /home/adam/ on fishfish to 755 (was 775).
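With StrictModes (the default), sshd refuses to trust authorized_keys when the home directory, ~/.ssh, or the file itself is writable by anyone other than the owner. A sketch of the required permissions, demonstrated on a scratch directory (on fishfish the real target was /home/adam):

```shell
# Scratch stand-in for /home/adam; sshd's StrictModes check rejects
# group- or world-writable home, .ssh, or authorized_keys.
home=$(mktemp -d)
mkdir "$home/.ssh"
touch "$home/.ssh/authorized_keys"
chmod 755 "$home"                       # was 775: group-writable, so keys were ignored
chmod 700 "$home/.ssh"
chmod 600 "$home/.ssh/authorized_keys"
stat -c '%a' "$home" "$home/.ssh" "$home/.ssh/authorized_keys"   # prints 755, 700, 600 (one per line)
```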
    • Modified the cron jobs to make sure writable permissions agree with each other

      • In Webmin, changed the User of the cron jobs comprehensiveBackup3.sh and notebookCopier.sh and barrettlabBackup.sh to adam
      • In the files /usr/local/bin/comprehensiveBackup3.sh and /usr/local/bin/notebookCopier.sh on chalkboard, changed all instances of mark@ to adam@
      • In the file /usr/local/bin/barrettlabBackup.sh on fishfish, changed all instances of mark@ to adam@
    • Updated the notebook bash scripts
      • Modified /usr/local/bin/bookieFinalGrads.sh and /usr/local/bin/bookieFinalUnderGrads.sh:
        • Removed /public_html/notebookspdf/ from paths
        • Removed /bin/ from bash script paths
      • In Webmin, changed the User of the cron jobs bookieCallerGRADS.sh and bookieCallerUNDERGRADS.sh to adam
        • Can’t be other users (such as mark) because the folder /home/adam/ must have permissions 755 for the backup scripts to work.
      • On both servers, changed permissions for all bash scripts under /usr/local/bin/ to 755 and changed the group to LabFolks
    • Fixed blAnimalNotice.sh

      • Problem #1: It was reading the primary census taker of the following day
      • Problem #2: The backup census taker was not matched to the right day (see Excel sheets in /media/shareX/Adam/20180412_blAnimalProblem)
      • Solution: The files eegNoticeGeneWean.sh and eegNoticeGeneWeanWeek.sh have been streamlined and updated to make the backup census taker the primary census taker of the following day. See the comments in these files for detail.
    • Backed up older versions of Mark’s bash scripts in /usr/local/bin to /media/markX/binbash_fishfish. Backed up eegNoticeGeneWean.sh and eegNoticeGeneWeanWeek.sh in /media/adamX/Settings_Linux

    • Problem of cron jobs not running:

      • Phenomenon: Cron jobs work when you run the script from Webmin, but not when you schedule it.

      • Solution: Debugged by including the following in the cron job command:

         blanimalNoticeToAdamOnly.sh \
         	1> /media/shareX/Adam/20180412_blAnimalProblem/blAnimalNotice.out \
         	2> /media/shareX/Adam/20180412_blAnimalProblem/blAnimalNotice.err
        
      • Explanation: crontab has its own default working directory and PATH. The default PATH is hard coded in the cron source code:

         #ifndef _PATH_DEFPATH
         # define _PATH_DEFPATH "/usr/bin:/bin"
         #endif
        
         #ifndef _PATH_DEFPATH_ROOT
         # define _PATH_DEFPATH_ROOT "/usr/sbin:/usr/bin:/sbin:/bin"
         #endif 
        

        Therefore, the jobs stopped working when I moved the scripts to /usr/local/bin

      • Work around:

        1. Move the scripts to /bin or /usr/bin

          • This is not preferred because it will be more difficult to manage user-implemented scripts separately from system-installed ones
        2. Use the full path to every script

          • This is also not preferred because the jobs won’t work if the scripts are moved in the future
        3. Source the /etc/environment file before submitting the cron job:

          . /etc/environment; script.sh
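A fourth option (not used here, but supported by the Vixie/ISC cron shipped with Ubuntu) is to set PATH at the top of the crontab itself, so every job inherits it. The hourly schedule and script name below are just an illustration:

```
PATH=/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

0 * * * * script.sh
```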
          
  3. Users/Groups

    • Created these user accounts

      • On fishfish: jordan, daniel

      • On chalkboard: jordan, daniel, ashley

      • Notes for Jordan Rodu and Daniel Keenan:

        We have two servers. I’ve made an account for each of you. The user names are daniel and jordan, and the password is the user name for now, but once you log in you will be prompted to change the password.

        Here are the IP addresses for the servers:

        Server name: fishfish

        IP: 128.143.16.164

        Server name: chalkboard

        IP: 128.143.16.133

        You can use X2Go Client or MobaXTerm to ssh into our servers. If you use X2Go, you’ll have to choose the XFCE or MATE desktop environments for the session type.

        I’ve created folders on fishfish for you to write to: /media/shareX/Jordan/ and /media/shareX/Daniel/, respectively. Please copy files and create new files in these folders instead of your home directories, because we do not make the home directories accessible via Samba. I’ve also created soft links to those folders in your home directories. Your home directories also contain other links to directories you might be interested in.

        After you reset your password, you can also map the following network locations from Windows:

        \\128.143.16.164\adamX (/media/adamX)

        \\128.143.16.164\shareX (/media/shareX)

        \\128.143.16.164\MatlabFishFish (/home/Matlab)

        \\128.143.16.133\barrettlab (/home/barrettlab)

        Project 1: Classify events in adrenal cell calcium-imaging data

        The main data/input/output directory for this project is /home/barrettlab/detect_with_minEASE/ on chalkboard.

        The time series from each cell have already been converted to MAT-files under the subdirectories in all_data/. Some example data selected for testing are under test_data/. The code used to analyze the data is compiled into a standalone executable. Use this command to run it:

        bash /home/barrettlab/detect_with_minEASE/minEASE_Linux_p11/run_minEASE.sh $MCR_ROOT

        It’s easiest to run the command from the folder input_Excel_file_p8/, which contains the newest version of the input Excel files. The input Excel file for running the test_data is miniTest_chalkboard_init_p8.xlsx. You can modify the parameters in that file to get a taste of what each parameter does to the detection. However, don’t modify the miniAll Excel files, as that could change outputs that the Barrett lab has already spent time annotating.

        You can use the init mode to run through all automatic detection without opening the GUI, then run again with the check mode to examine the results in the GUI. If you change parameters, you can either rename the test_output directory and rerun under the init mode, or run under the rerun mode, which automatically archives the previous result.

        The code for the minEASE program can be found in /home/Matlab/minEASE/ on fishfish. There are some dependent files in other subdirectories of /home/Matlab/. The locations of the custom functions are all documented at the top of the files that use them. For instance, the code for baseline noise calculation is the file rms_Gaussian.m under /home/Matlab/Kojis_Functions/. Currently, the code is set to read-only. Once the lab GitHub account is set up, we can collaborate via GitHub, but feel free to copy and paste for now.

        Project 2: Fitting single TC neuron models to dynamic clamp data

        This project is all on fishfish. The data directory is /media/adamX/m3ha/data_dclamp/take4/. The model directory is /media/adamX/m3ha/optimizer4gabab/. The NEURON code is in the .mod, .tem and .hoc files. The code for fitting is a little complicated, so I won’t elaborate here. It is mostly annotated, so feel free to look through it, but it’s not as well annotated as the code for minEASE. If you just want to run the NEURON model once, there is a test script you can use and modify: test_MATLAB_unix_NEURON.m.

        The attached notebook file contains descriptions of the relevant mechanisms that are inserted into each of the three compartments of a model thalamocortical neuron. The equations for updating the current densities (or the total current for the GABAB mechanism, which is inserted only in the somatic compartment) are described in the files. The current densities are then used to update the voltage levels in each compartment. The equation governing the latter is a function of membrane resistance, axial resistance and membrane capacitance, which in turn are functions of the diameters and lengths of the cylinders (themselves parameters fitted to passive current impulse response data). This amounts to 20 parameters to fit across trials and 6 parameters to fit across cells.

        Lastly, as we discussed before, you can already access the servers if you are off ground using VPN. If you are on ground, we will need your IP addresses. Please let me know if you have further questions. Thanks for taking an interest in this!

    • Created the Collaborators group on both servers:

      • These accounts have this group as primary on fishfish: barrettlab, daniel, jordan
      • These accounts have this group as primary on chalkboard: barrettlab, baylisslab, daniel, jordan
    • Changed all home directory ownership to (user):(primary group) on both servers

    • Removed unnecessary accounts from the sudoers group, and added ashley to both

    • In Webmin, under Samba Windows File Sharing, turned on User Synchronization and Group Synchronization (clicked Yes to all questions)
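The account and group steps above could be scripted roughly as follows. This is a sketch assuming Debian-style tools (adduser, passwd, groupadd, usermod); the `run` helper only prints each command here so the sketch is safe to dry-run anywhere — on the real servers, drop it and execute the commands with sudo:

```shell
# 'run' echoes instead of executing; remove it to actually run the commands.
run() { printf '+ %s\n' "$*"; }

for u in jordan daniel; do
    run adduser --disabled-password --gecos "" "$u"   # create the account
    run passwd --expire "$u"                          # force a password change at first login
done
run groupadd Collaborators
run usermod -g Collaborators jordan                   # make Collaborators the primary group
```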

  4. Cron jobs (cont’d)

    • Apparently crontab runs jobs in a very rudimentary shell environment: it does not inherit the user’s environment variables, so the user’s $PATH is not consulted, only cron’s hard-coded default. Therefore, when I removed /bin/ from the cron job commands, crontab never found the commands (and relative paths are resolved against the user’s home directory, cron’s working directory).

    • Interestingly, if you use Webmin to run the scripts manually, it can find the bash scripts even without the full path (which is what led me to remove /bin/ in the first place). So you can’t really tell through Webmin whether a cron job will work when scheduled.

    • Checking the system log file with this command:

      grep -C 2 CRON /var/log/syslog
      

      doesn’t really help either, as it prints the same thing regardless of whether the job succeeds…

    • Reinstated /usr/local/bin/ to the commands (that’s where the bash scripts are now) and they seem to be executed. Used a modified version of test_sleep.sh for testing purposes. It generates a log file with the time stamp every second in the /tmp/ directory.
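The test script’s behavior, as described, can be sketched like this (name and details assumed from the description above; the real test_sleep.sh runs until killed, while this sketch stops after three iterations so it terminates):

```shell
# Append a timestamp to a log under /tmp/ once per second.
logFile="/tmp/test_sleep_$$.log"      # $$ = PID, avoids collisions between runs
for i in 1 2 3; do
    date '+%Y-%m-%d %H:%M:%S' >> "$logFile"
    sleep 1
done
wc -l < "$logFile"    # 3
```

If the log keeps growing on schedule, cron found and executed the script.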

  5. UFW:

    • Displays warning message:

      WARN: Duplicate profile 'Apache', using last found
      WARN: Duplicate profile 'Apache Secure', using last found
      WARN: Duplicate profile 'Apache Full', using last found

    • Based on this discussion, investigated the directory /etc/ufw/applications.d/ and realized (with vimdiff) that the files apache2.2-common and apache2-utils.ufw.profile are exactly the same. Therefore, removed the older one (apache2-utils.ufw.profile). This removed the warnings.

  6. Back up settings: Created backup_settings.sh under /media/adamX/Settings_Linux/ and created a cron job that schedules root to run it every hour

    # Hard-coded parameters
    backupFolder='backup_settings'
    
    # Files to back up
    declare -a filesToBackup=("/etc/passwd" "/etc/group" 
                            "/etc/hosts" "/etc/shells"
                            "/etc/environment" 
                            "/etc/fstab" "/etc/crontab" 
                            "/etc/ntp.conf")
    
    # Get the server name
    serverName=$(hostname)
    
    # Get the path to current file
    path=$(dirname "$(readlink -f "$0")")
    
    # Get the user name
    userName=$(whoami)
    
    # Back up files to backupFolder
    for file in "${filesToBackup[@]}"
    do
        # Take out the first / from the path
        fileMod=${file:1}
    
        # Replace / with _ globally in the path
        fileMod=${fileMod//\//_}
    
        # Replace . with DOT globally in the path
        fileMod=${fileMod//./DOT}
    
        # Create the full path of backup file
        backupPath="${path}/${backupFolder}/backup_${fileMod}_${serverName}.log"
    
        # Use rsync archive mode to back things up
        rsync -avhu --progress "$file" "$backupPath"
    done
    
    # Back up firewall settings
    backupPath="${path}/${backupFolder}/backup_ufw_IPs_${serverName}.log"
    if [[ ${userName} == "root" ]] ; then
        # Do not need sudo as root
        ufw status numbered > $backupPath
    else
        # Need sudo if not root
        sudo ufw status numbered > $backupPath
    fi
    
    # Change permissions for these log files to 775 and group to LabFolks
    allLogFiles="${path}/${backupFolder}/*.log"
    if [[ ${userName} == "root" ]] ; then
        chgrp LabFolks ${allLogFiles}
        chmod 775 ${allLogFiles}
    else
        sudo chgrp LabFolks ${allLogFiles}
        sudo chmod 775 ${allLogFiles}
    fi
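The three parameter expansions in the backup loop above flatten an absolute path into a file-name-safe token. A quick illustration with one of the backed-up files:

```shell
# /etc/ntp.conf -> etc_ntpDOTconf, step by step
file=/etc/ntp.conf
fileMod=${file:1}           # drop the leading /   -> etc/ntp.conf
fileMod=${fileMod//\//_}    # every / becomes _    -> etc_ntp.conf
fileMod=${fileMod//./DOT}   # every . becomes DOT  -> etc_ntpDOTconf
echo "$fileMod"             # etc_ntpDOTconf
```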
    
  7. Installed the trash-cli utility

    • Command:

      sudo apt-get install trash-cli
      
    • The utility provides the following commands:

      trash-put           trash files and directories.
      trash-empty         empty the trashcan(s).
      trash-list          list trashed files.
      trash-restore       restore a trashed file.
      trash-rm            remove individual files from the trashcan.
      
    • Added this line to ~/.bash_aliases

      alias clean='trash-put *~'