Yes I know the Bash & Linux command line tools !

Learn the foundations & develop the core skills through real-world examples
4.7 (3 ratings)
13 students enrolled
Created by Sylvain Leroux
Last updated 9/2016
English
Price: $65
30-Day Money-Back Guarantee
Includes:
  • 3.5 hours on-demand video
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
  • use the syntax and features required for your daily Bash usage in an interactive shell ("from the console")
  • leverage glob patterns, regular expressions, shell variables and standard commands to do more while typing less
  • perform faster, by using one or two commands, tasks that could require many clicks in GUI applications (checking file type & integrity, computing md5 or sha-2 sums, extracting ZIP or TAR archives,...)
  • process SVG, JSON and even images, videos or sounds through the command line
  • combine several basic commands to perform complex tasks
  • query web services from the Bash
  • use glob patterns to move tens or hundreds of files in only one command
  • aggregate, filter & process scientific, economic or financial data stored in a tabular text format (like CSV) without using a spreadsheet
  • gain the confidence & the skills required to improve your knowledge by yourself
View Curriculum
Requirements
  • You should have a working Linux system
  • You should be able to add new software using your package manager
Description

Why : The command line is a way to interact with a computer by typing commands instead of using a point-and-click interface. Historically, it was the only way to interact with a computer. Despite that, it is still relevant today. Not because it's cool or makes you feel like you're part of some select club. But because, by using it, you can be more efficient in your daily work with the computer. Especially for repetitive tasks or tasks dealing with lots of data or files.

But the command line can be daunting at first too : there is so much to know before you can use it. And it is anything but intuitive to use.

What : My goal is to provide you with strong foundations so you can start using the Bash and Command Line Tools on Linux to be more efficient in your daily work. But most importantly, I want to give you the confidence & the skills required so you'll be able to improve your knowledge by yourself, according to your needs.

The course is not designed as an extensive list of all the standard Unix & Linux command line tools. I don't think video would be the best medium for such reference material. If you're looking for that, check out some of the numerous books written on the topic, or search the Internet for freely available resources. In fact, such resources can be a very good complement to this course.

How : The course is based on real-world examples where the Bash & Command Line Tools are used to solve everyday problems. You are encouraged to try and experiment on your own system with the examples given in the lectures. The majority of lectures end with a challenge you will have to solve. Most of the time, you will have to apply or adapt the commands seen in the course. But sometimes, this will require some experimentation and research on the Internet or in the online help system (the man).

You are encouraged to use the forum/Q&A section to ask questions, but also to post your own solutions to the challenges, and to share tips & tricks related to the course.

Who is the target audience?
  • If you want to spend less time on repetitive tasks on your computer, this course is for you : using the command line, it is usually not much more difficult to apply a command to many files than to only one.
  • If you worked on Unix previously but want a refresher on the "new" features of the Bash, this course is for you : I make extensive use of modern Bash features not available, for example, in the historic Bourne Shell (sh)
  • If you are a student discovering the shell & the command line at school, but you want a different point of view or more examples, this course IS for you.
  • If you are regularly working with text data that needs to be pre-processed, this course IS for you.
  • If you watched The Matrix or Tron & thought the command line is cool but have no clue where to start, this course IS for you. And yes, the command line is cool. But no, I'm not Trinity...
  • If you don't know how to install Linux on your computer, this course IS NOT for you. There are plenty of tutorials on the web showing you step by step how to install Linux. But feel free to consider this course again once your system is up & running.
  • If you don't know how to add new software on your system using your package manager, this course IS NOT for you. The package manager is a tool allowing you to install new software on your system. Each Linux distribution has its own package manager and usually provides one or several graphical tools to access it. I just can't provide the exact procedure for each and every Linux distro. Please take some time to search the Internet for how it works on your system. Then feel free to come back and consider this course again.
  • If you want to learn ZSH, KSH, TCSH or any shell other than the Bash, this course is NOT for you : I make extensive use of Bash features here. Most of them are probably available in other shells, but the syntax is certainly not the same...
  • If you watched Jurassic Park and thought that FSN, the "File System Navigator", is the ultimate way to navigate the file system, this course is NOT for you.
Compare to Other Linux Courses
Curriculum For This Course
40 Lectures
03:41:21
+
First steps
3 Lectures 15:29

Welcome to this course ! If you want to know why having played with Lego prepared you with the right mindset to use the command line tools efficiently, watch this video.

And if you didn't have access to Lego -- or didn't like playing with them ? Don't worry ! The key concept is simple : you have basic building blocks, but to build something really interesting or useful, you have to combine those building blocks. And the whole course is about that.

But, if you come here because you're looking for some reference material on the Linux command line tools, may I suggest you take a look at The Linux Documentation Project :

  • Bash Beginners Guide : http://www.tldp.org/LDP/Bash-Beginners-Guide/html/index.html
  • GNU/Linux tools summary : http://www.tldp.org/LDP/GNU-Linux-Tools-Summary/html/index.html


In fact, I would say this course and those reference docs are rather complementary, so don't hesitate to bookmark them !

Preview 01:42

What do I need to follow this course ? Well, a computer running Linux and ... you will see the rest in this video. Don't worry, there are not many requirements. The only real skill you will need is being able to install software using your distribution's package manager. If you're not sure about that, please take a few minutes to search the Internet for how to install new software on your particular Linux version. Usually, there is a graphical tool provided for that. But the details depend on your exact Linux distribution.

S02-Requirements
07:03

At last ! We will start to "play" with the terminal and try our first commands...

... but you will have to wait a few minutes at the start of the video while I explain the course organization.


The most important thing I want to mention here too is "the forum is your friend" -- even if the "forum" is called "Q/A section" in the new Udemy player ! Whatever its name, it is the same concept : you, as a student, have a way to ask questions and post messages the other students can see. Don't hesitate to use that feature -- if you have some difficulties of course, but also to share your own experience with this course, and to post your solutions to the challenges. That way students will be able to compare their solutions, just like when we debrief after an exercise when I teach live.

Preview 06:44
+
The very basic core commands
9 Lectures 50:09

In this lesson you will learn how to download a file using the GNU wget command. This is quite easy and probably wouldn't deserve a lesson of its own. But I take that occasion to introduce you to the man. That is the online manual available on virtually all Linux and Unix-like systems.

Before the Internet era, it was our most important source of information. When it wasn't simply the only one. And even today, the man is absolutely incomparable when looking for reference information about a command.

As for myself, I probably use the Internet 60% of the time and the man the remaining time : the Internet is great and search engines are incomparable when you are looking for quick help on some common use case for a command. But if you're looking for some more advanced feature or for an obscure option, the man is still the place to look. So I think it's a good habit to familiarize yourself with that tool. Even if it is not your everyday tool -- knowing it exists can save your day once in a while !
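
To give you a rough idea, here is the kind of command pair this lesson is about (the URL is only a placeholder, not necessarily the one used in the video) :

  # Download a file from the web
  wget http://www.example.com/some-file.tar.gz

  # Open the manual page for wget ('q' to quit, '/' to search)
  man wget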

S04B-Downloading a file using the command line
03:57

In the previous lesson, you downloaded a .tar.gz file. This is a compressed archive file very common in the Linux and Unix world. In this lesson you will learn how to extract the content of that archive. In addition, you will continue to familiarize yourself with the basic command line tools and the core concepts behind them. Finally, you will discover a nice tool that could help you identify the exact type of some unknown file...
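
As a small sketch of the commands involved (the archive name is only an example) :

  # Identify the actual type of a file, regardless of its name or extension
  file some-file.tar.gz

  # Extract a gzip-compressed tar archive (x = extract, z = gunzip, f = file name)
  tar xzf some-file.tar.gz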

S05-Extracting an archive
09:17

The goal of this quiz is to train you in using the man ... or searching the Internet : after all, the important thing is to find the answer, whatever way you use to reach it.

S05Q-Getting informations [QUIZZ]
4 questions

In this lesson we will continue our journey exploring the directory structure using the unavoidable commands ls, cd or pwd -- and you will discover that ~/.. is not an obscure emoticon...

Sometimes the directory structure is called the "file system". But this is a bit of a language abuse, as the file system is the set of underlying mechanism(s) used by the operating system to store and retrieve data on disk, whereas the directory structure is the way the files are presented and made accessible to user-space processes. That being said, I'm probably guilty of having used the word filesystem for both concepts more than once...
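
To give you a taste of the commands used in this lesson :

  pwd        # print the current working directory
  ls         # list the content of the current directory
  cd ~       # go to your home directory
  cd ~/..    # go to the parent of your home directory (not an emoticon !)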

S06-Navigating the directory structure
09:25

Ever heard those stories about a Trojan horse or some other malicious program being hidden as an "innocent" image file ? How could that be possible ? In this video you will see how we can be easily fooled by very simple tricks. And even without considering the security implications, observation is a very important skill to develop when using the command line. Paying careful attention to subtle hints might help you understand what's going wrong...


About GEEK's CORNER videos : Those videos are about more advanced, more subtle or less commonly used material than the rest of the course. That does not mean you should skip those videos. Simply, the content might be more difficult to grasp at your current skill level. So, don't be afraid to go on to the next lesson even if the content is not entirely clear after having watched a GEEK's CORNER video the first time -- and come back later when you feel more confident.

S07-File tricks [GEEK's CORNER]
04:45

If you're not familiar with that, as soon as we start talking about the file hierarchy you are bombarded with new concepts and vocabulary. I tried to summarize the key information in this short animation, including some concepts I do not explicitly use in the course but that you might encounter while reading docs on the Internet, for example.

S06B-File hierarchy summary [animation]
01:39

In this quiz, you will test your understanding of the basic commands and concepts used to navigate the directory structure of your system.

The quiz will cover the concept of path, current working directory, home directory & parent directory, as well as the commands cd, ls, cp, mv & rm.

Feel free to use external resources or to try the questions on your own system : the goal is not to reach 100% the first time you try the quiz, but to reinforce your knowledge and to train you in discovering things by yourself.

S06Q-Navigating the directory structure [QUIZZ]
6 questions

Things are getting serious now ! Up until now, you've mostly learned to do, using the command line, things you could do using your file manager (aka, file browser or file explorer). But what about acting on several files at once ? For example, how do you move hundreds of files in one step ? How do you copy word-processor files containing the word "project", and only those files, when they are lost in a folder full of unrelated garbage files ? Those are typically situations where the command line will shine. Not because of the power of the ls or mv commands, but because of a Shell feature called pathname expansion

Instead of giving the exact name of the files you're interested in, you can use what is called a glob pattern to describe a set of files. And the Bash will use that pattern to build the list of matching file names to give to the underlying command you want to execute.

Seems complicated ? A few examples and a little bit of practice will make it obvious. But beware : this is a very addictive feature ! And once you are familiar with it, chances are you will wonder how you could have lived without it before...
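
A minimal sketch of pathname expansion in action (the file names are purely illustrative) :

  # Move all the .jpeg files of the current directory to another folder
  mv *.jpeg ~/Pictures/

  # List the files whose name starts with "project", whatever the extension
  ls project*

  # Copy report-01.odt ... report-09.odt using the single-character wildcard
  cp report-0?.odt ~/backup/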

S08-Glob patterns
05:27

In this quiz, you will test your understanding of the glob patterns.

Some questions go beyond what is covered in the lecture. So feel free to use external resources or to try the questions on your own system : the goal is not to reach 100% the first time you try the quiz, but to reinforce your knowledge and to train you in discovering things by yourself.

S08Q-Glob patterns [QUIZZ]
6 questions

Copying or moving files when you know where they are is easy. But it is not always that easy to remember where that @#!? file is. And sometimes I want to copy files scattered across several subdirectories. All these are situations that can be handled gracefully by the find command. The really cool thing with the find command is not only that it will ... find ... files, but that it will also let you execute the command you want on the files it finds !

Here we will focus on finding files by their filenames. But by searching through the man, you will discover this command is really the swiss army knife for finding files, even based on the most obscure criteria.
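
Here is, roughly, the kind of command this lesson builds toward (paths and patterns are illustrative) :

  # Find, below the current directory, every file whose name ends in .odt
  find . -name '*.odt'

  # Same search, but copy each matching file to ~/backup ('{}' stands for the file name)
  find . -name '*.odt' -exec cp {} ~/backup/ \;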

S09-Finding files
05:19

So the find command allows complex search patterns and actions ? As an example, in this video we will see several ways to exclude a subdirectory from the search path. But the most important skill you will practice here is not memorizing the meaning of the -prune or \! arguments, but gaining the reflex of searching through the man for the more advanced features of the commands you already know...
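
As an illustration of the idea (not necessarily the exact command used in the video) :

  # Search for .mp3 files, but skip everything below the ./Archives subdirectory
  find . -path ./Archives -prune -o -name '*.mp3' -print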

S10-Restricting the scope of find
04:54

The find command dates back to the mid-70s. It is still the traditional way of recursively finding files in subdirectories, and still the preferred way for the most advanced use cases. But recursively finding files by name is such a common use case that sooner or later it had to be incorporated into the shell. The Z shell (zsh) pioneered that idea in 1990. The feature appeared in 2009 in Bash 4.0 through the (disabled by default) globstar shell option.

Optional shell features can be turned on and off using the shopt built-in command. That mechanism allows the user to adjust the set of Bash features enabled on their system, either to tailor the shell to their own taste, or to maintain compatibility with past usage. If you search for shopt in the Bash manual or on the Internet, you might discover some other goodies not mentioned here, like the cmdhist and lithist options [1] I discovered myself while recording one of the last videos of this course...


[1] http://unix.stackexchange.com/questions/109032/how-to-get-a-history-entry-to-properly-display-on-multiple-lines
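
A minimal sketch, assuming Bash 4.0 or later :

  shopt -s globstar    # enable the globstar option for the current shell
  ls **/*.mp3          # recursively list every .mp3 file below the current directory
  shopt -u globstar    # turn it back off if you prefer the default behavior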

Preview 05:26
+
Input & output
4 Lectures 22:14

Still using the find command, we will this time learn how we could control the way it prints matching files on the screen. This is called formatting the output of a command (a much less destructive thing than disk formatting ;).

Like many other tools offering that capability, the find command uses meta-characters and formatting sequences like %p or \n for that purpose. Those are directly inherited from the standard printf [1] function of the C programming language. By the way, this is not mentioned in the video, but Bash has a printf internal command you may use from the command line. It's a kind of echo command on steroids -- may I suggest you take a look at man printf or man bash ? You might find it interesting.

Why ? Because precisely controlling the way results are reported can be a very useful (and life-simplifying) feature when you want to use that result either in a full-fledged application like a spreadsheet or, as we will see later, as the input for another command.


[1] https://en.wikipedia.org/wiki/Printf_format_string
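
For instance, something along these lines (GNU find is assumed for the -printf action) :

  # Print each matching file name followed by its size in bytes
  find . -name '*.mp3' -printf '%p\t%s\n'

  # The Bash printf built-in, a kind of echo on steroids
  printf '%s has %d bytes\n' song.mp3 4096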

S12-Formatting the output of the find command
04:21

"Everything is a file" was one of the core concepts behind the Unix philosophy. That means, from a command perspective, printing to the screen and writing to a file is the same thing.

All commands we used so far displayed their result on the screen. But by definition, they can write their result to a file too. In fact, the large majority of them will not even notice the difference. All you have to do to redirect the output of a command to a file. To do that, simply add > followed by a file name after the command. And yes, it will work with any command...


[1] https://en.wikipedia.org/wiki/Everything_is_a_file
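
A minimal example of output redirection :

  # Without redirection, the result goes to the screen
  ls ~/Music

  # With redirection, the very same result goes into a file instead
  ls ~/Music > music-listing.txt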

S13-Redirections
04:28

In the previous lesson we introduced the notion of redirection. Output redirection to be precise. But there are other kinds of redirection. A typical process, that is, a running command or application, starts its life with one input and two outputs. By default the input is connected to the console keyboard and the outputs are connected to the console screen. But any of them can be redirected to/from a file.

And things can go even further, as the output of a process can be directly connected to the input of another process. Not only does this avoid creating an intermediate file, but as a side effect it allows both processes to run simultaneously (both at the same time), instead of sequentially (that is, one after the other). That cool mechanism is called a pipe -- probably because data flows from one process to another just like water flows in a pipe.
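
A small sketch of the difference (the commands are illustrative) :

  # With an intermediate file : the two commands run one after the other
  find . -name '*.mp3' > list.txt
  sort list.txt

  # With a pipe : no intermediate file, and both processes run simultaneously
  find . -name '*.mp3' | sort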


S14-Pipes
06:32

In the first few videos of this course I tried to consistently use double quotes when quoting was needed. But as the course progressed, you probably noticed that here and there I used single quotes instead. In many cases, you have the choice and can use whichever quoting style you prefer.

Nevertheless, there are subtle differences in the way the shell parses single- or double-quoted arguments. After this lesson, you will be able to identify the cases where you can omit quoting, those where you have the choice between single or double quotes -- and the cases where you will have to use the right quoting style to obtain the intended result.
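
A small sketch of the kind of differences discussed here :

  FILE="My Music"
  echo "$FILE"    # double quotes : the variable is expanded  -> My Music
  echo '$FILE'    # single quotes : no expansion at all       -> $FILE
  echo $FILE      # no quotes : expansion AND word splitting, so echo receives two separate words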

About GEEK's CORNER videos : Those videos are about more advanced, more subtle or less commonly used material than the rest of the course. That does not mean you should skip those videos. Simply, the content might be more difficult to grasp at your current skill level. So, don't be afraid to go on to the next lesson even if the content is not entirely clear after having watched a GEEK's CORNER video the first time -- and come back later when you feel more confident.

S15-Quotes [GEEK's CORNER]
06:53
+
Case study : cleaning up my music folder
6 Lectures 26:44

This short lesson is mostly a revision. You will discover the mess I made in my Music folder, and we will start cleaning things up by removing all the unneeded files there and by dividing my audio and lyrics files between two different folders.

S16-(Re)Moving files
02:17

Do I have all the lyrics corresponding to my audio files ?

In this lesson I had a great idea : why not simply count the number of files in both the Audio and the Lyrics directories ? If both numbers match, well, that means I do have all the lyrics. Isn't that a great idea ? Well, writing that, I now have a doubt...

Anyway, there is no standard tool to count files. But there is the wc tool, able to count lines in a text file. Maybe there is a way to leverage the wc tool to count files instead...
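
One possible way to leverage wc for that purpose -- a sketch, the lesson may do it slightly differently :

  # When its output goes to a pipe, ls prints one file name per line,
  # so counting the lines gives the number of files
  ls Audio | wc -l
  ls Lyrics | wc -l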

S17-Counting things
03:00

Maybe just counting the number of files wasn't sufficient to ensure I really have the lyrics corresponding to the songs in my Audio folder. But I have a better idea now : displaying the listings of both directories side by side should allow me to spot the differences, if any. Well, I hope...

Luckily, this is the canonical use case for the paste command ! How lucky we are. Since there is not much difficulty here, this lesson was a great occasion for me to introduce process substitution [1]. And no, <() is not an emoticon either !

[1] http://tldp.org/LDP/abs/html/process-sub.html
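
A minimal sketch of the idea, using the directory names of the lesson :

  # Display the listings of the two directories side by side ;
  # <( ... ) is process substitution, not an emoticon
  paste <(ls Audio) <(ls Lyrics)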

S18-Pasting things
04:23

OK. It was not that easy to spot mistakes simply by comparing two lists of files side by side. Especially if you are concerned about typos. We could use the diff command to spot the differences in a more reliable way.

Unfortunately, a raw comparison is useless since the filenames are different anyway : audio files have the .mp3 extension, whereas my lyrics have a .txt or .pdf extension. It would be nice to have a tool to cut that extension before performing the comparison...
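
One possible way to do that kind of comparison, as a sketch (the exact commands used in the video may differ) :

  # Keep only the part of each name before the first dot, then compare the two listings
  diff <(ls Audio | cut -d . -f 1) <(ls Lyrics | cut -d . -f 1)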

S19-Diff'ing things
04:16

Ouch : a two-part GEEK's CORNER... This must be a really tough subject !

In this lesson you will discover in more depth the way processes are interconnected when you use a pipe or when you create a sub-shell by grouping commands using parentheses. Understanding this material will help you diagnose apparently odd behaviors when building more complex commands. And it is probably a key skill for creating such commands !

And for those of you who always want more -- try to compare those two commands after having watched this video (beware of the spaces & semi-colons -- some, but not all, are required ;) :

  • ( cd /tmp ; ls ; )
  • { cd /tmp ; ls ; }


About GEEK's CORNER videos : Those videos are about more advanced, more subtle or less commonly used material than the rest of the course. That does not mean you should skip those videos. Simply, the content might be more difficult to grasp at your current skill level. So, don't be afraid to go on to the next lesson even if the content is not entirely clear after having watched a GEEK's CORNER video the first time -- and come back later when you feel more confident.

S20-Combining commands 1/2 [GEEK's CORNER]
05:23

At this point you know you can combine commands using pipes or semi-colons (among others). But you may also combine commands using the command list operators && and ||. To fully understand them, you must first understand the concept of exit status.

To summarize (and to quote the Bash manual) :

  • A list [of commands] is a sequence of one or more pipelines separated by one of the operators ;, &, &&, or ||, and optionally terminated by one of ;, &, or <newline>.


So, after having completed this lesson you will be able to say "I know them all".

All ? No... Wait a minute : did I mention the single ampersand (&) in the course ? What could its role be ? As a hint, it is not related to the exit status of the commands...
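
A small illustration of the two operators (file names are illustrative) :

  # The second command runs only if the first one succeeds (exit status 0)
  mkdir backup && cp *.odt backup/

  # The second command runs only if the first one fails (non-zero exit status)
  grep -q 'TODO' notes.txt || echo "nothing left to do"

  # The exit status of the last command is available in the special variable $?
  echo $?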


About GEEK's CORNER videos : Those videos are about more advanced, more subtle or less commonly used material than the rest of the course. That does not mean you should skip those videos. Simply, the content might be more difficult to grasp at your current skill level. So, don't be afraid to go on to the next lesson even if the content is not entirely clear after having watched a GEEK's CORNER video the first time -- and come back later when you feel more confident.

S21-Combining commands 2/2 [GEEK's CORNER]
07:25
+
Working on text files
6 Lectures 33:09

In this section, I will speak a lot about text files. Broadly speaking, all the files on your computer can be split into two major categories : text files and binary files. Historically, text files were preeminent in the Unix culture. The reason behind that (sometimes attributed to the late Joseph Ossanna) was that text files are the simplest and most general way of storing data -- thus allowing the development of generic tools and increasing interoperability. After all, even if you don't have the specific tool required, at worst, you can still edit a text file "by hand".

Even today, text files and text data are still part of our everyday digital life. The HTTP protocol request and response headers are text data. The HTML page you are reading now is text data. Just like an SVG image or any other XML-based data format. This page, just like many apps running on your smartphone, uses JSON to exchange data with the Udemy server. And JSON is text. If you work with financial or scientific data, maybe you use CSV files ? CSV is a text format. Right in front of me, I see some LibreOffice documents on my Desktop. Those are (zip-compressed) text documents too [1] !

Great, you will say. But that still leaves open the question : what is a text file ?


[1] https://en.wikipedia.org/wiki/OpenDocument#Specifications

S22-What is a text file ?
04:22

"Where did I read that ?" "When did he send me emails ?" Those are questions I regularly ask to myself. So, on my email client, I'm a great user of the <ctrl>-f keyboard shortcut. Maybe the mapping is different in your locale, but on my French Linux system, this is the usual shortcut for the "find" command in GUI applications. This is a nice tool. But not always sufficient. Because I'm not always seaching into my emails, but sometimes into thousands of source files. Or, even if only considered emails, I do not have to search into one mailbox. But on tens or hundreds of them. Data aggregation, disaster recovery, forensic analysis are few cases where you could have such needs.


There is a standard tool for that : this is grep -- not only it can show you lines in a text file matching some text, but you can use it the other way around to find files containing some text...
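
As a sketch of both directions of use (the file names are illustrative) :

  # Show the lines of a file matching some text
  grep "invoice" mailbox.txt

  # The other way around : list the files containing some text, searching recursively
  grep -rl "invoice" .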

S23-Grepping files
04:24

In this lesson, I will introduce sed, the stream editor.

A stream editor is simply a tool that processes the data in a text file line by line, either to filter them (discarding some lines) or to edit them (altering their content). What makes sed great is that you can program it, that is, give instructions on how to process the individual lines of the file. Not only does this allow you to emulate other commands using sed, but most importantly it allows you to tailor the processing to your own special needs. And despite its apparent simplicity, it can be a really, really powerful tool.
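
To give you an idea of what a programmable line-by-line editor looks like (a sketch, not necessarily the examples of the video) :

  # Print only the lines containing "chorus" (filtering)
  sed -n '/chorus/p' lyrics.txt

  # Delete the first three lines of the file (editing)
  sed '1,3d' lyrics.txt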

S24-Introducing sed
08:36

In the previous video, I introduced the concept of regular expressions. A regular expression is nothing more than a pattern describing a set of strings :

  • The French Social Security number is made of 15 digits. Assuming no particular formatting, a regular expression matching any French SSN could be [0-9]{15}
  • A regular expression matching any word starting by a and ending by e would be a\w*e
  • The set of all lines made of an odd number of characters could be written as ^.(..)*$

Regular expressions are really a great tool. And an important part of the Unix culture. You will see them used in many different tools. But the drawback is that they can quickly become really complicated. In this lesson we will dig a little bit deeper into that complexity, and see how you can use Internet resources to help you write or understand regular expressions.
http://xkcd.com/208/
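
If you want to play with those patterns directly from the shell, grep -E (extended regular expressions) makes a convenient test bench -- a sketch, on a hypothetical data.txt file :

  # Keep only the lines made of exactly 15 digits (an unformatted French SSN)
  grep -E '^[0-9]{15}$' data.txt

  # Keep only the lines made of an odd number of characters
  grep -E '^.(..)*$' data.txt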


About GEEK's CORNER videos :
Those videos are about more advanced, more subtle or less commonly used material than the rest of the course. That does not mean you should skip those videos. Simply, the content might be more difficult to grasp at your current skill level. So, don't be afraid to go on to the next lesson even if the content is not entirely clear after having watched a GEEK's CORNER video the first time -- and come back later when you feel more confident.


S25-More on regular expressions [GEEK's CORNER]
07:35

Regular expressions are made to identify matching strings or characters. But once you have a match, sed can perform a substitution based on the matching pattern. Think of it as an overpowered search-and-replace feature.

Need examples beyond those provided in the lecture ? Here are a few :

  • s/Windows/Linux/g will replace all occurrences of Windows by Linux 
  • s/^(.)(.*)$/\1/ will keep only the first letter of each line
  • s/([0-9]{2})-([0-9]{2})-([0-9]{4})/\3\2\1/ reformat a date from the DD-MM-YYYY format to YYYYMMDD


Did you notice the little g at the end of the first example above ? What could its meaning be ?
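
You can try the date example above directly from the command line (GNU sed is assumed ; the -E option enables extended regular expressions) :

  echo "27-09-2016" | sed -E 's/([0-9]{2})-([0-9]{2})-([0-9]{4})/\3\2\1/'
  # output : 20160927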


http://www.grymoire.com/Unix/Sed.html#toc-uh-6

https://www.gnu.org/software/sed/manual/sed.html#The-_0022s_0022-Command


S26-Substitutions
07:05

Maybe you feel a little bit confused about regular expressions ? The ones provided in the lesson are too complex ? Or you feel like things are going too fast ? Don't panic !

Here are a few simple textbook-style examples. I provided the link to the corresponding regex101 snippet [1], so you will be able to practice and experiment at your own will and speed.

And remember, it takes time to develop your regex skills : so practice, practice, and practice again !


[1] https://regex101.com/r/tC2nI5/3

S26B-Regular expression examples [animation]
01:07
+
Batch renaming files
2 Lectures 18:45

Did you ever have to change some filename extension from .jpe to .jpeg for standardization reasons ? Or to replace spaces with underscores in filenames because of some obscure regulation ? Or simply to ensure all your MP3 or video files follow a consistent naming scheme ?

If you have only one file to rename, doing it by hand is not a big deal. If you have a few files to rename, it starts to be annoying. And as for myself, if I had even, say, 10 files to rename, I would consider writing a program to do that for me. But such a program already exists and is called prename. On Debian-based systems, prename is part of the perl package. But it is simply a Perl script of a few lines. So if for some reason you have Perl on your system but not prename, I provided the required source to download [1].

And this is a great occasion to show you how to check the integrity of a file you've downloaded using md5 [2] or sha-2 [3] (sha-256 in this case) sums. And to learn how to make a script executable using chmod +x. And how to modify the PATH to make it easily accessible from the shell.


[1] https://gist.github.com/s-leroux/64b47b89938bc76589857013f7b67f17

[2] https://en.wikipedia.org/wiki/MD5

[3] https://en.wikipedia.org/wiki/SHA-2
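
As a rough sketch of the steps involved (the ~/bin location is only an example -- compare the checksums with the values published alongside the script) :

  # Check the integrity of the downloaded script
  md5sum prename
  sha256sum prename

  # Make the script executable
  chmod +x prename

  # Make it reachable from anywhere by adding its directory to the PATH
  export PATH="$HOME/bin:$PATH"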


S27-Installing prename
12:06

Now, I'm sure prename is available on your system. So let's practice a little bit with that batch renaming tool. And could we find a better occasion than checking the consistency of my movie filenames ?

Wait a minute : what does "batch renaming" files mean ? Well, it is a way of saying we will apply the same transformation to several filenames -- actually renaming an arbitrary number of files in only one command ! Of course, in case of a mistake, we risk trashing my entire movie collection by giving silly names to all those files. But you'll act carefully. Well, I hope...
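
A small sketch of what prename usage looks like (prename takes a Perl substitution expression ; the patterns below are generic examples, not the ones from my movie collection) :

  # Rename every .jpe file to .jpeg
  prename 's/\.jpe$/.jpeg/' *.jpe

  # Replace spaces with underscores in all the .mp4 file names
  prename 's/ /_/g' *.mp4

  # With -n, prename only shows what it *would* do -- a good habit before the real run
  prename -n 's/ /_/g' *.mp4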

S28-Using prename
06:39
+
Working with CSV files
2 Lectures 09:00

Of course you remember the cat command. Did I mention it can be used to (con)catenate several files together ? Well, anyway, that was in the man.

So let's consider that as a revision. Or an occasion to discover the CSV files we will use in this section...
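
Just as a reminder (the file names are illustrative) :

  # Display one file
  cat january.csv

  # conCATenate several files, sending the result to a new file
  cat january.csv february.csv march.csv > quarter.csv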

S29-Catenate files
02:09

Data in a CSV file can be seen as tabular data. And when I speak about CSV files, in my mind I use CSV as a generic term. There are plenty of CSV-like text formats -- using something other than a comma to separate the data. For example, take a look at the /etc/passwd [1] file on your computer...

The great news is you can filter and reorder data in those files using only a few standard commands. sed, cut and awk are the only tools required. Well in fact, awk is all you need...

[1] https://en.wikipedia.org/wiki/Passwd
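
A tiny example of that kind of filtering, using the colon-separated /etc/passwd file :

  # Keep only the login name (field 1) and the default shell (field 7)
  cut -d : -f 1,7 /etc/passwd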

S30-Filtering and reordering columns in a CSV file
06:51
+
Working with Web data
4 Lectures 21:38

awk is a full-fledged programming language -- and probably deserves a course of its own. Here, we will stick to the basics, but I wanted to show you how it can be used to perform simple processing of your CSV data.

If you're regularly dealing with scientific or financial data, maybe it is worth taking the time to investigate that tool a little bit more. You might find it really helpful for automating some repetitive tasks...
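
A minimal sketch of awk applied to a CSV-like file (the file and its column layout are hypothetical) :

  # Print the 1st and 3rd columns of a comma-separated file,
  # but only for the rows whose 3rd column is greater than 100
  awk -F ',' '$3 > 100 { print $1, $3 }' sales.csv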

S31-More awk
03:01

HTML is the lingua franca of the web. A huge majority of web pages are written using one variant of HTML or another. We will not dig into that language. The only interesting thing for us now is : this is a text format. So all the tools we already know can be used to process or create such files.

That being said, for more advanced usage, if you are familiar with web technologies (CSS selectors) and if you have the freedom to do so, installing a dedicated tool such as pup [1] or hxselect [2] is probably a better option. Those tools are not covered in this course.

But what is covered here is the "for" loop, a Bash construct allowing you to execute the same commands on each item of a list of data (all the files of a folder, all the words in a file, ...)


[1] https://github.com/ericchiang/pup

[2] https://www.w3.org/Tools/HTML-XML-utils/
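
The general shape of the Bash for loop, as a sketch :

  # Apply the same commands to every .html file of the current directory
  for f in *.html
  do
      echo "processing $f"
      grep -c '<img' "$f"    # for example, count the lines containing an image tag
  done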

S32-Looping
05:58

Today, "web data" is not longer synonym of HTML or XML only. More and more web services allows to query and receive data as JSON [1].

Like HTML in the previous lesson, JSON is a text data format and can be processed using the standard text tool you already know. But this time, you will use a dedicated tool called jq [2] instead, to gather poster images for the videos of my movie collection.

[1] https://en.wikipedia.org/wiki/JSON

[2] https://stedolan.github.io/jq/
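
To give you an idea of the tool (the JSON file, field names and URL are purely hypothetical, not the actual web service used in the video) :

  # Extract the "title" field from a JSON document
  jq '.title' movie.json

  # Or directly from the output of a web request
  curl -s 'https://api.example.com/movie/42' | jq '.poster'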

S33-Querying web services
07:06

Text, text, text... We spend a lot of time working with text data. But what about handling images using the command line ?

As a transition, we will see how, using a few simple commands, we can create a bunch of SVG images.

Is this difficult ? No : SVG is text. A textual description of a vector image. When you open an SVG file with your Web browser or any other SVG-capable application, it will draw the image corresponding to that description. Including the changes you might have made to the source code using our old friends, the standard tools...
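
Since an SVG file is just text, the tools we already know apply -- a sketch, with an illustrative file name :

  # Change every red element of an SVG image to blue
  sed 's/fill="red"/fill="blue"/g' drawing.svg > drawing-blue.svg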

S34-Tweaking SVG images
05:33
+
Working with binary data
3 Lectures 22:54

We already "worked" with images in the previous lesson. And it wasn't even in the same section of the course... Was that a mistake ?

No, because the important word is bitmap. A bitmap image (sometimes called a raster image) describes an image pixel by pixel [1], usually by giving each pixel's color (in the RGB space, for example). An "HD" image has a resolution of 1920x1080, that is, about 2 million pixels. This is a lot of data. So, for efficiency, most bitmap image formats use a binary format instead of a text format. This means we need different tools to process them. Dedicated tools. Not only to deal with the binary representation of the data, but also because we need features specific to image processing. Here comes ImageMagick [2] to the rescue...

[1] https://en.wikipedia.org/wiki/Raster_graphics

[2] http://www.imagemagick.org
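
A couple of typical ImageMagick one-liners (a sketch ; the file names are examples) :

  # Convert a PNG image to JPEG
  convert picture.png picture.jpg

  # Create a thumbnail at 20% of the original size
  convert picture.png -resize 20% thumbnail.png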

S35-Working with bitmap images
09:50

In this lesson I will introduce the SoX [1] package to create, process and convert audio files. It will not replace a Digital Audio Workstation (DAW) like Ardour or an audio editor like Audacity. But if you have some repetitive tasks to perform on audio files, or if you have to synthesize sounds (for music) or reference wave forms (for scientific applications) this is a tool to consider.

[1] http://sox.sourceforge.net/
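
Two small examples of the kind of things SoX can do (file names are illustrative ; MP3 support depends on your SoX build) :

  # Convert a WAV file to MP3
  sox recording.wav recording.mp3

  # Synthesize a 3-second 440 Hz reference sine wave
  sox -n a440.wav synth 3 sine 440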

S36-Working with sounds
04:37

This is our last lesson. Last but not least, I will introduce here ffmpeg -- a wonderful tool to manipulate audio/video files.

There are very few things that can't be done using ffmpeg. And it understands countless codec and container formats. But how complicated this tool can be to use ! With many advanced command line options, options whose meaning varies depending on where they appear in the argument list, and codec/muxer-specific options, this is the most complex tool I regularly use. Once again, this tool deserves a course of its own. Here we will only perform some transcoding, and I will show you how you can create a video from still images.

A few remarks though. Since 2011 [1], some Linux distributions (Debian notably) have packaged a different tool called avconv as ffmpeg. This caused a lot of confusion, without much benefit, for 4 or 5 years. At this point, the "real" ffmpeg will be back in future (post-Jessie) Debian versions. I tried to ensure the commands I give here work both with the real ffmpeg and with avconv.

But as collateral damage, other tools might not work or might simply be absent if your distribution is avconv-only -- most notably the mplayer tool I will use in this lesson. If you can't find mplayer in your package manager, I would suggest mpv as a replacement.

[1] https://en.wikipedia.org/wiki/Libav#History
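
A couple of typical commands of the kind shown here (a sketch ; the exact options may vary with your ffmpeg or avconv build) :

  # Transcode a video to H.264
  ffmpeg -i input.avi -c:v libx264 output.mp4

  # Build a video from still images named img001.png, img002.png, ...
  ffmpeg -framerate 24 -i img%03d.png -c:v libx264 output.mp4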

S37-Working with videos
08:27
+
Conclusion
1 Lecture 01:19

This is the last video of this course. This is so sad...

S38-Final word
01:19
About the Instructor
Sylvain Leroux
4.7 Average rating
3 Reviews
13 Students
1 Course
Engineer by passion, teacher by vocation

For 15 years, I taught computer science & information technology to students of all ages, from adults to young teenagers.

When I teach, I have two goals : share my enthusiasm for what I teach, and prepare my students so they can improve their knowledge by themselves.

I encourage you to take a look at my StackOverflow profile (link below) to see some of my favorite technologies -- as well as to check my answers there : I always try to be respectful of people looking for help, while providing accurate answers.