Find online courses made by experts from around the world.
Take your courses with you and learn anywhere, anytime.
Learn and practice real-world skills and achieve your goals.
Programming interviews are like standard plays in professional sport: prepare accordingly. Don't let programming interview gotchas get you down!
What's Covered:
Talk to us!
Mail us about anything (anything!) and we will always reply :)
Not for you? No problem.
30-day money-back guarantee.
Forever yours.
Lifetime access.
Learn on the go.
Desktop, iOS and Android.
Get rewarded.
Certificate of completion.
Section 1: Introduction  

Lecture 1  01:46  
Programming/Coding Interviews stand between you and your dream job. 

Lecture 2  18:59  
Coding interviews are nothing like software engineering jobs: they tend to be stressful and focus on the hardest parts of a software engineer's job. However, getting your core concepts right, with a lot of practice, is the secret sauce to cracking the coding interview.  
Section 3: Pointers and Arrays  
Lecture 3  20:00  
Pointers are the foundation of all hard interview problems; visualizing the memory layout using pointers helps us understand what's really happening under the covers.  
Lecture 4  16:30  
Practice is the key to understanding key pointer concepts: solve pointer problems by visualizing the memory layout. Arrays are pointers at heart; work with them exactly as you would with pointers. 

Lecture 5  13:04  
Pointers know how much space the data they point to occupies, which means you can access all the elements of an array with a single pointer. Pointer arithmetic is pretty cool!  
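To make this concrete, here is a minimal C sketch (illustrative only, not code from the lecture) that visits every element of an array through a single pointer:

```c
#include <stdio.h>

/* Sum an array using a single pointer and pointer arithmetic.
   Because the pointer knows the size of an int, p + 1 advances by
   sizeof(int) bytes to the next element. */
int sum_with_pointer(const int *arr, int n) {
    int sum = 0;
    for (const int *p = arr; p < arr + n; p++) {
        sum += *p;   /* dereference the element the pointer is on */
    }
    return sum;
}
```

Note that `arr + n` is the one-past-the-end position, which is exactly what pointer arithmetic makes easy to express.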
Lecture 6  09:23  
A whole bunch of practice with pointer problems. These should be easy because you will have the memory layout right there to help you visualize things.  
Section 3: Strings are just pointers at heart  
Lecture 7  17:46  
Strings are character pointers, which are equivalent to character arrays. Reimplement functions from the string.h library for practice dealing with pointers. 
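As a taste of this kind of exercise, here is a hand-rolled version of strlen (a sketch; the name my_strlen is ours, chosen to avoid clashing with the library function):

```c
#include <stddef.h>

/* Walk the character pointer until the null terminator, then the
   pointer difference gives the character count, like strlen does. */
size_t my_strlen(const char *s) {
    const char *p = s;
    while (*p != '\0') {
        p++;
    }
    return (size_t)(p - s);  /* pointer difference = length */
}
```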

Lecture 8  10:48  
Pointers passed as arguments to functions have subtleties which need to be understood to use them correctly. Understand reassignment and modification of pointers inside a function, and see how the original pointers in the calling code are affected.  
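The distinction can be sketched like this (illustrative C; the function names reassign and modify are hypothetical):

```c
/* Reassigning the parameter only changes the local copy of the
   pointer; the caller's pointer is untouched. */
void reassign(int *p) {
    int local = 42;
    p = &local;      /* the caller's pointer does NOT change */
    (void)p;         /* suppress unused-value warning */
}

/* Modifying what the pointer points to IS visible to the caller. */
void modify(int *p) {
    *p = 42;         /* the caller's int changes */
}
```

Pointers are passed by value, so the parameter is a copy of the address; only dereferencing reaches the caller's data.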
Lecture 9  20:00  
Let's solve some harder problems from the string.h library. Remember, little details like string termination and null inputs all matter in an interview!  
Section 4: Linked lists can be fun!  
Lecture 10  12:04  
Pointers just hold the address of a memory location, which means we can have pointers to pointers to pointers. Sounds complicated? No worries, there are examples to help you understand these every step of the way.  
Lecture 11  13:10  
Pointers to pointers require a heightened conceptualization of memory layout. See detailed visuals of how pointer modification and reassignment work. User-defined types, or structs, can also have pointers to them; we cover the memory layout and visualization of struct pointers.  
Lecture 12  16:54  
Linked lists are favorite interview questions for software engineering roles. If you can work with linked lists, you're on your way to tackling more complicated problems. Understand the memory setup of linked lists and start with a few problems to gain confidence. Get the length of a linked list, access the nth element in a list, and append an element to the end of the list, all while handling null lists and other details. 
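For illustration, here is a minimal sketch of the length and append operations with null-list handling (the Node type and function names are ours, not necessarily the course's):

```c
#include <stdlib.h>

/* A minimal singly linked list node. */
typedef struct Node {
    int value;
    struct Node *next;
} Node;

/* Length of the list; a NULL head is a valid empty list. */
int list_length(const Node *head) {
    int count = 0;
    for (const Node *cur = head; cur != NULL; cur = cur->next)
        count++;
    return count;
}

/* Append a value and return the (possibly new) head, so that a
   NULL list is handled as well. */
Node *list_append(Node *head, int value) {
    Node *node = malloc(sizeof(Node));
    node->value = value;
    node->next = NULL;
    if (head == NULL)
        return node;            /* empty list: new node is the head */
    Node *cur = head;
    while (cur->next != NULL)   /* walk to the last node */
        cur = cur->next;
    cur->next = node;
    return head;
}
```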

Lecture 13  07:04  
For a simple concept, linked lists can get surprisingly tricky very quickly. Practice and practice again to gain mastery over linked list problems. Append an element to the end of the list, all while handling null lists and other details. 

Lecture 14  20:01  
For a simple concept, linked lists can get surprisingly tricky very quickly. Practice and practice again to gain mastery over linked list problems. Use the linked list as a stack and implement pop; delete all the elements in a list (tricky memory freeing here); insert an element at a specified position and into a sorted list (these are similar, but the edge cases differ). 

Lecture 15  18:25  
Once you've actually solved all the examples we've got so far, you'll find that linked lists are fun! This class has a particularly difficult sample problem which introduces fast and slow pointers, which traverse a linked list at different speeds. Useful for a lot of tricky maneuvering. Append one list to another, and split a list into two by using fast and slow pointers. This second problem is much, much harder than it seems. 
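The fast/slow pointer idea can be sketched as follows (illustrative C with our own names; finding the middle node is the core step behind splitting a list in two):

```c
#include <stddef.h>

typedef struct Node {
    int value;
    struct Node *next;
} Node;

/* The fast pointer moves two steps for every one step of the slow
   pointer, so when fast hits the end, slow is at the middle.
   Returns the middle node (second middle for even length), or NULL
   for an empty list. */
Node *find_middle(Node *head) {
    Node *slow = head, *fast = head;
    while (fast != NULL && fast->next != NULL) {
        slow = slow->next;        /* one step  */
        fast = fast->next->next;  /* two steps */
    }
    return slow;
}
```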

Lecture 16  18:25  
By now you should be able to solve linked list problems in your sleep. Let's practice a few last ones to gain complete mastery over them. Remove duplicates from a sorted list, move the first node from one list to another, merge two sorted lists, and finally reverse a linked list. 

Section 5: Bit Manipulation  
Lecture 17  10:47  
We dig into the bitwise AND, OR and NOT operations, visually inspecting how they work.  
Lecture 18  09:33  
We continue with bit manipulation. The right shift and left shift operators are very powerful, but they have two issues that you should be sure to understand: overflow and fill.  
Lecture 19  13:34  
Before diving headlong into bit manipulation problems it's helpful to learn a few useful tricks which help you build a strong foundation to visualize working with bits.  
Lecture 20  13:43  
Functions to get the nth bit of an integer and to set the nth bit of an integer. These are the building-block functions, and the concepts underlying them will be used for harder bit manipulation problems. 
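A sketch of these two building blocks (our function names, assuming bits are numbered from 0 at the least significant end):

```c
/* Shift the target bit down to position 0, then mask it out. */
int get_bit(unsigned int num, int n) {
    return (num >> n) & 1u;
}

/* OR with a mask that has only the nth bit on. */
unsigned int set_bit(unsigned int num, int n) {
    return num | (1u << n);
}
```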

Lecture 21  19:47  
Print all the bits used to represent an integer, from the most significant bit to the least significant. Learn some subtle details about the right shift (>>) with negative numbers! Count the number of 1s in an integer, and learn a neat trick which allows you to do it in O(number of 1s) time. 
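The O(number of 1s) counting trick is commonly attributed to Brian Kernighan; a sketch:

```c
/* n & (n - 1) clears the lowest set bit of n, so the loop body
   runs exactly once per 1 bit: O(number of 1s). */
int count_ones(unsigned int n) {
    int count = 0;
    while (n != 0) {
        n &= (n - 1);   /* drop the lowest set bit */
        count++;
    }
    return count;
}
```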

Lecture 22  11:21  
Reverse the bits in an integer. This pulls together a whole bunch of stuff from the last few problems. As with all hard problems, visualizing the process is key to solving this!  
Section 6: General programming problems - practice makes perfect  
Lecture 23  19:27  
During coding interviews you might encounter questions which you can work out from first principles. You should be nailing these! Let's start with figuring out whether a string is a palindrome and finding all the points within a certain distance from another point.  
Lecture 24  11:09  
Play the Game of Life, where every cell can change state between live and dead based on its neighbours.


Lecture 25  10:49  


Lecture 26  20:00  
Run-length encoding involves specifying the number of times a character is repeated in a string. Decoding run-length-encoded strings can be pretty tricky; let's find a solution for both. If a number were represented by its digits, could you write code to add two numbers represented in this way? Let's walk through a solution and see if you can get this right. 

Lecture 27  20:00  
Write code to check whether a Sudoku board is valid. This should work for both complete and incomplete boards. Sudoku is tricky, and there are many conditions to check. Lastly, set up your own numeric system and then increment a number represented in that system by 1. 

Section 7: Big-O Notation, Sorting And Searching Algorithms  
Lecture 28  15:56  
What is the performance of your code? How do you measure this? What is complexity and what is its relationship with performance? 

Lecture 29  18:09  
Big O notation is used to express complexity based on the size of the input to an algorithm. How is Big O expressed, and how is it calculated? Many examples to drive the concepts home!  
Lecture 30  18:16  
Big O notation becomes much clearer when you practice finding the complexity of some sample pieces of code. Let's see how many of these you get right!  
Lecture 31  10:05  
A sorting algorithm is not just defined by its complexity; there are a whole bunch of other characteristics which can be used to determine which sorting algorithm is the right one for a system. Let's understand what these characteristics are and what trade-offs we might make.  
Lecture 32  14:36  
Selection sort: the simplest and most naive sorting algorithm.  
Lecture 33  14:43  
Closely allied with selection sort is bubble sort. It's an adaptive sort with the same time complexity as selection sort.  
Lecture 34  15:34  
Insertion sort is an improvement over both bubble sort and selection sort. Let's see how exactly it works and why it's preferred in many cases. 

Lecture 35  15:13  
Shell sort builds on top of insertion sort; it improves the complexity of its running time by partitioning the list in a clever way.  
Lecture 36  20:00  
This sort belongs to a class of algorithms which use divide and conquer to break the problem into smaller pieces. It also makes a time-space trade-off to get a faster running time.  
Lecture 37  19:16  
Quick sort is the sort of choice for developers of programming libraries. Let's see what makes it so attractive.  
Lecture 38  12:00  
Binary search is a pretty nifty way to search through a sorted list in O(log N) time. It's also an interview favorite, so make sure you understand it well!  
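A sketch of the iterative version (illustrative C, not the lecture's code):

```c
/* Halve the search range each step, giving O(log N) comparisons on
   a sorted array. Returns the index of target, or -1 if absent. */
int binary_search(const int *arr, int n, int target) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of lo + hi */
        if (arr[mid] == target)
            return mid;
        else if (arr[mid] < target)
            lo = mid + 1;               /* target is in the right half */
        else
            hi = mid - 1;               /* target is in the left half  */
    }
    return -1;
}
```

A classic interview gotcha hides in `mid`: writing `(lo + hi) / 2` can overflow for very large arrays.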
Section 8: Recursion and the recursive sense  
Lecture 39  17:50  
Recursion is pretty hard at the beginning. Let's look at an example of reversing a string and see how we can use recursion to solve the problem. Visualize the input and every step and see how the magic of recursion works.  
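One possible recursive sketch of string reversal (the helper names are ours): swap the outermost pair of characters, then recurse on the inner substring.

```c
#include <string.h>

/* Base case: zero or one characters left, nothing to swap. */
void reverse_range(char *s, int lo, int hi) {
    if (lo >= hi)
        return;
    char tmp = s[lo];      /* swap the outermost pair */
    s[lo] = s[hi];
    s[hi] = tmp;
    reverse_range(s, lo + 1, hi - 1);   /* recurse inward */
}

void reverse_string(char *s) {
    reverse_range(s, 0, (int)strlen(s) - 1);
}
```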
Lecture 40  14:24  


Lecture 41  16:43  
A classic problem which can be solved recursively. Try out a few smaller sets and see how their subsets look to find patterns.  
Lecture 42  16:32  
Binary trees lend themselves to problems which have really beautiful recursive solutions. The problems may seem hard, but the solutions end up being simple. Checking whether two trees are the same is one such problem. 

Lecture 43  13:42  
Paint fill allows you to color regions on screen while using drawing software. Implement a recursive solution to paint fill a region on the display screen.  
Lecture 44  15:54  
Say you were given all the tasks needed to build a complete car. These tasks may depend on one another. Set up a data structure to represent a task and its dependencies, and write code to build a car. 

Lecture 45  16:18  
An anagram of a word is simply a word with the letters of the original word rearranged. Generating anagrams is complicated but lends itself well to a recursive solution.  
Lecture 46  12:45  
There can be several paths out of a maze. Help a rat placed anywhere in the maze find its way out.  
Lecture 47  19:48  
Another classic problem with an elegant recursive solution.  
Section 9: Stacks And Queues  
Lecture 48  15:36  
The stack is a very simple, easy-to-understand data structure. However, it lies underneath many complicated real-world problems and is incredibly useful.  
Lecture 49  16:35  
Let's build a stack for real using Java. It'll have all the operations we're interested in: push, pop, peek, size, etc. It can hold any data type; it's a generic class.  
Lecture 50  11:19  
Problems which use stacks as part of their solutions are very common in programming interviews. Matching parentheses to check for well-formed expressions is a classic interview question; let's solve it using the stack we've already implemented.  
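A compact sketch of the idea in C, using a fixed-size array as the stack (illustrative only; the course's own solution uses the Java stack built in the previous lecture):

```c
/* Check that (), [], {} are well formed. Push the EXPECTED closer
   for each opener; every closer must match the top of the stack.
   Returns 1 if balanced, 0 otherwise. */
int is_balanced(const char *expr) {
    char stack[256];
    int top = -1;
    for (const char *p = expr; *p != '\0'; p++) {
        char c = *p;
        if (c == '(' || c == '[' || c == '{') {
            if (top == 255)
                return 0;    /* nesting too deep for this sketch */
            stack[++top] = (c == '(') ? ')' : (c == '[') ? ']' : '}';
        } else if (c == ')' || c == ']' || c == '}') {
            if (top < 0 || stack[top--] != c)
                return 0;    /* mismatch, or closer with nothing open */
        }
    }
    return top == -1;        /* balanced only if nothing is left open */
}
```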
Lecture 51  08:47  
Another interview question implemented. You have space available but your processing needs to be very fast indeed. How would you keep track of the minimum element of a stack as it changes?  
Lecture 52  13:16  
The queue belongs to the same linear data structure family as the stack, but its behavior is very different. Queues are much more intuitive, as there are plenty of real-world examples where a queue is the fair and correct way of processing.  
Lecture 53  19:59  
A common, fast, but slightly tricky implementation of the queue is an array where the last element wraps around to the first. An interview favorite; let's see how to implement the circular queue. 

Lecture 54  17:36  
We know the stack, and we know the queue. This problem brings them together. It's possible to mimic the behavior of a queue using two stacks in the underlying implementation. Let's write the most efficient code possible to make this work.  
Section 10: Binary Trees  
Lecture 55  13:28  
The binary tree is an incredibly useful hierarchical data structure. Many other, more complex data structures, use the binary tree as the foundation. Let's see what a binary tree looks like and learn some simple terminology associated with the tree.  
Lecture 56  17:23  
Traversing a binary tree can be done in a variety of ways. Breadth-first traversal visits and processes nodes at every level before moving on to the next. Let's visualize breadth-first traversal and see how it's implemented.  
Lecture 57  14:43  
Depth-first traversal can be of three types, based on the order in which a node is processed relative to its left and right subtrees. Preorder traversal processes the node before processing the left and then the right subtree. 

Lecture 58  13:13  
Depth-first traversal can be of three types, based on the order in which a node is processed relative to its left and right subtrees. Inorder traversal processes the left subtree, then the node itself, and then the right subtree. Postorder traversal processes the node *after* its left and right subtrees. The algorithms are all remarkably similar and very easy once you use recursion. 
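An inorder traversal can be sketched in just a few lines (illustrative C; the node and function names are ours):

```c
#include <stddef.h>

typedef struct TreeNode {
    int value;
    struct TreeNode *left;
    struct TreeNode *right;
} TreeNode;

/* Inorder: left subtree, then the node, then the right subtree.
   Visited values are appended to out; returns the new count.
   Preorder and postorder just move the out[count++] line. */
int inorder(const TreeNode *node, int *out, int count) {
    if (node == NULL)
        return count;                         /* base case */
    count = inorder(node->left, out, count);  /* left subtree   */
    out[count++] = node->value;               /* the node itself */
    return inorder(node->right, out, count);  /* right subtree  */
}
```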

Section 11: Binary Search Trees  
Lecture 59  09:25  
A Binary Search Tree is a binary tree with specific constraints which make it very useful for certain operations. Learn what a BST is and how we can use it. 

Lecture 60  18:46  
Insertion and Lookup are operations which are very fast in a Binary Search Tree. See how they work and understand their performance and complexity.  
Section 12: Binary Tree Problems  
Lecture 61  13:02  
Find the minimum value in a binary search tree, find the maximum depth of a binary tree and mirror a binary tree. Learn to solve these problems recursively and see implementation details. 

Lecture 62  15:28  
Count the number of structurally unique binary trees that can be built with N nodes, print the nodes within a certain range in a binary search tree and check whether a certain binary tree is a binary *search* tree. Learn to solve these problems and understand the implementation details.  
Lecture 63  16:43  
Check if a path from root node to leaf node has a specified sum, print all paths from the root node to all leaf nodes and find the least common ancestor for two nodes in a binary tree. Learn to solve these problems and understand the implementation details. 

Section 13: Heaps  
Lecture 64  15:55  
Priority Queues allow us to make decisions about which task or job has the highest priority and has to be processed first. Common operations on a Priority Queue are insertion, accessing the highest priority element and removing the highest priority element. The Binary Heap is the best implementation of the Priority Queue. 

Lecture 65  12:24  
The Binary Heap is logically a Binary Tree with specific constraints. Constraints exist on the value of a node with respect to its children and on the shape of the tree. The heap property and the shape property determine whether a Binary Tree is really a Heap. 

Lecture 66  18:02  
The Binary Heap may logically be a tree, but the most efficient way to implement it is with an array. Real pointers from parent to child and from child to parent become implicit relationships between the indices of the array.  
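The implicit relationships for a 0-based array can be sketched as simple index arithmetic (illustrative helper names):

```c
/* In an array-backed binary heap stored from index 0, the tree
   "pointers" are pure arithmetic on indices. */
int parent_index(int i)      { return (i - 1) / 2; }
int left_child_index(int i)  { return 2 * i + 1; }
int right_child_index(int i) { return 2 * i + 2; }
```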
Lecture 67  07:42  
Let's build a real heap in Java!  
Lecture 68  19:57  
How do we ensure that, when we add or remove an element from an existing heap, the heap property and shape property are maintained? This operation is called Heapify.  
Lecture 69  15:14  
Once we understand heapify, adding and removing elements from a heap become very simple.  
Section 14: Revisiting Sorting - The Heap Sort  
Lecture 70  19:56  
Back to sorting. The Heap Sort uses a heap to transform an unsorted array into a sorted array. Phase I is converting the unsorted array into a heap.  
Lecture 71  19:56  
Phase II actually outputs the final sorted array. It involves removing the elements from the heap and placing them in a sorted array. The cool thing is that all of this can be done in place.  
Section 15: Heap Problems  
Lecture 72  15:26  
Let's practice heap problems! Use the heap property to find the largest element in a minimum heap and the K largest elements in a stream.  
Lecture 73  13:09  
An interview favorite: the streaming median! This uses heaps in a very interesting and elegant way.  
Lecture 74  17:49  
Starting with K sorted lists, get one complete sorted list which includes all the elements from the original K lists. This is easy and efficient when done using heaps.  
Section 16: Graphs  
Lecture 75  14:08  
The graph is a data structure that is used to model a very large number of real-world problems. It's also a programming interview favorite. The study of graphs and the algorithms associated with them forms an entire field called graph theory.  
Lecture 76  07:58  
Edges in a graph can be directed or undirected. A graph with directed edges forms a Directed Graph, and one with undirected edges forms an Undirected Graph. These edges can be likened to one-way and two-way streets.  
Lecture 77  14:02  
Different relationships can be modeled using either Directed or Undirected graphs. When a graph has no cycles, it's called an acyclic graph. A connected graph with no cycles is basically a tree.  
Lecture 78  09:37  
There are a number of different ways in which graphs can be implemented. However, they all follow the same basic graph interface. The graph interface allows building up a graph by adding edges, and traversing a graph by giving access to all adjacent vertices of any vertex.  
Lecture 79  15:13  
An adjacency matrix is one way in which a graph can be represented. The graph vertices are rows and columns of the matrix and the cell value shows the relationship between the vertices of a graph.  
Lecture 80  18:25  
The adjacency list and the adjacency set are alternate ways to represent a graph. Here the connection between the vertices is represented using either a linked list or a set.  
Lecture 81  10:10  
Compare the adjacency matrix, adjacency list and adjacency set in terms of the space and time complexity of common operations.  
Lecture 82  15:41  
Common traversal methods for trees apply to graphs as well. There is an additional wrinkle with graphs: dealing with cycles and with unconnected graphs. Otherwise the algorithms are exactly the same as those we use to traverse trees. 
Loonycorn is us, Janani Ravi, Vitthal Srinivasan, Swetha Kolalapudi and Navdeep Singh. Between the four of us, we have studied at Stanford, IIM Ahmedabad, the IITs and have spent years (decades, actually) working in tech, in the Bay Area, New York, Singapore and Bangalore.
Janani: 7 years at Google (New York, Singapore); Studied at Stanford; also worked at Flipkart and Microsoft
Vitthal: Also Google (Singapore) and studied at Stanford; Flipkart, Credit Suisse and INSEAD too
Swetha: Early Flipkart employee, IIM Ahmedabad and IIT Madras alum
Navdeep: longtime Flipkart employee too, and IIT Guwahati alum
We think we might have hit upon a neat way of teaching complicated tech courses in a funny, practical, engaging way, which is why we are so excited to be here on Udemy!
We hope you will try our offerings, and think you'll like them :)