Beginning Java Data Structures and Algorithms

Sharpen your problem solving and data organization skills using Java data structures and algorithms
0.0 (0 ratings)
9 students enrolled
Created by Packt Publishing
Last updated 1/2019
English
English [Auto-generated]
Current price: $129.99 Original price: $199.99 Discount: 35% off
30-Day Money-Back Guarantee
This course includes
  • 3 hours on-demand video
  • 1 downloadable resource
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • Understand some of the fundamental concepts behind key algorithms
  • Express space and time complexities using Big O notation
  • Correctly implement classic sorting algorithms such as merge and quicksort
  • Correctly implement basic and complex data structures
  • Learn about different algorithm design paradigms, such as greedy, divide and conquer, and dynamic programming
  • Apply powerful string matching techniques and optimize your application logic
  • Master graph representations and learn about different graph algorithms
Course content
35 lectures 02:55:48
+ Algorithms and Complexities
6 lectures 24:14

In this course, you will learn the core concepts pertaining to algorithms and data structures. Popular example algorithms, such as bubble, quick, and merge sort, help you relate the material to familiar programming scenarios, such as arranging numbers in ascending or descending order. Detailed explanations of data structures such as binary trees, hash tables, and graphs will help you understand how they improve a program's efficiency.

Preview 02:29

In this lesson, we will examine different types of algorithms and discuss how their performance varies. We will discuss what makes one algorithm more efficient than another and how to express the complexity of each.

Lesson Overview
01:09

An algorithm can be seen as a roadmap or a set of instructions for accomplishing a well-defined task. In this topic, we will build a simple example of one such algorithm to help us get started.
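
For illustration only (this is not necessarily the example built in the lecture), here is a minimal Java sketch of such an algorithm: a set of instructions that finds the largest value in an array.

    // A well-defined task: find the largest value in a non-empty array.
    class FirstAlgorithm {
        static int findMax(int[] values) {
            int max = values[0];                     // start with the first element
            for (int i = 1; i < values.length; i++) {
                if (values[i] > max) {
                    max = values[i];                 // remember the largest value seen so far
                }
            }
            return max;
        }
    }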

Developing Our First Algorithm
03:05

Algorithmic complexity is a way to describe the efficiency of an algorithm in relation to the size of its input. It can be used to describe various properties of our code, such as runtime speed or memory requirements, and it is an important tool programmers should understand in order to write efficient software. In this topic, we will start by describing a scenario to introduce the subject, and then dive into the details of the various types of complexity and the different techniques for measuring them.
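
As a hedged illustration (not taken from the lecture itself), the two Java methods below grow differently with the input size n: the first performs work proportional to n, so it is O(n), while the second compares every pair of elements, so it is O(n²).

    class ComplexityExamples {
        // O(n): touches each element exactly once.
        static long sum(int[] a) {
            long total = 0;
            for (int x : a) {
                total += x;
            }
            return total;
        }

        // O(n^2): compares every pair of elements.
        static boolean hasDuplicate(int[] a) {
            for (int i = 0; i < a.length; i++) {
                for (int j = i + 1; j < a.length; j++) {
                    if (a[i] == a[j]) {
                        return true;
                    }
                }
            }
            return false;
        }
    }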

Measuring Algorithmic Complexity with Big O Notation
10:21

In this topic, we shall look at examples of algorithms with different complexities. This is important so that we can learn to recognize which complexity class an algorithm belongs to and, where possible, improve its performance.
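
Two further sketches, added here for illustration and not taken from the lecture: a constant-time O(1) operation, and an O(log n) binary search that halves the search space on every step (it assumes the input array is already sorted).

    class ComplexityClasses {
        // O(1): constant time, independent of the array size.
        static int firstElement(int[] a) {
            return a[0];
        }

        // O(log n): halves the remaining range on every iteration.
        static boolean binarySearch(int[] sorted, int key) {
            int lo = 0, hi = sorted.length - 1;
            while (lo <= hi) {
                int mid = lo + (hi - lo) / 2;
                if (sorted[mid] == key) return true;
                if (sorted[mid] < key) lo = mid + 1;
                else hi = mid - 1;
            }
            return false;
        }
    }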

Identifying Algorithms with Different Complexities
06:42

This section summarizes what we have learned in the earlier topics.   

Preview 00:28
Test Your Knowledge
4 questions
+ Sorting Algorithms and Fundamental Data Structures
6 lectures 31:43

In the previous lesson, we saw how the intersection problem can be improved by using a sorting algorithm. This is common with many problems: if the data is organized in an ordered manner, a more efficient algorithm can be developed. In this lesson, we will start by exploring three sorting techniques: bubble, quick, and merge sort. Later, we will learn different ways to organize data using fundamental data structures.

Preview 01:01

Bubble sorting is the simplest sorting algorithm. In this topic, we will learn about it in more detail.   
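
A minimal Java sketch of bubble sort, for reference (the lecture's own implementation may differ in detail):

    class BubbleSort {
        // Repeatedly swap adjacent out-of-order elements; after pass k,
        // the k largest values are already in their final positions.
        static void sort(int[] a) {
            for (int pass = 0; pass < a.length - 1; pass++) {
                boolean swapped = false;
                for (int i = 0; i < a.length - 1 - pass; i++) {
                    if (a[i] > a[i + 1]) {
                        int tmp = a[i];
                        a[i] = a[i + 1];
                        a[i + 1] = tmp;
                        swapped = true;
                    }
                }
                if (!swapped) {
                    break; // no swaps means the array is already sorted
                }
            }
        }
    }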

Introducing Bubble Sorting
04:48

Quick sort is a divide and conquer algorithm that works by partitioning the array around a pivot element and then recursively sorting the partitions on either side of it. On average it runs in O(n log n) time, which makes it one of the most widely used sorting algorithms in practice. In this topic, we will examine how it works step by step.
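
A compact Java sketch of quick sort using the Lomuto partition scheme; this is one common variant, offered for illustration rather than as the lecture's exact code.

    class QuickSort {
        // Sorts a[lo..hi] in place by partitioning around a pivot.
        static void sort(int[] a, int lo, int hi) {
            if (lo >= hi) return;
            int p = partition(a, lo, hi);
            sort(a, lo, p - 1);
            sort(a, p + 1, hi);
        }

        // Lomuto partition: moves the pivot (last element) into its final position.
        private static int partition(int[] a, int lo, int hi) {
            int pivot = a[hi];
            int i = lo;
            for (int j = lo; j < hi; j++) {
                if (a[j] < pivot) {
                    swap(a, i++, j);
                }
            }
            swap(a, i, hi);
            return i;
        }

        private static void swap(int[] a, int x, int y) {
            int tmp = a[x];
            a[x] = a[y];
            a[y] = tmp;
        }
    }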

Understanding Quick Sort
08:25

Although quick sort is quite fast on average, it still has a theoretical worst-case time complexity of O(n²). In this topic, we shall examine another sorting algorithm, called merge sort, whose worst-case time complexity is O(n log n). Like quick sort, merge sort belongs to the divide and conquer class of algorithms.
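
A top-down merge sort sketch in Java, included for illustration (the lecture may use a different variant):

    import java.util.Arrays;

    class MergeSort {
        // Divide the array in half, sort each half recursively, then merge.
        static void sort(int[] a) {
            if (a.length < 2) return;
            int mid = a.length / 2;
            int[] left = Arrays.copyOfRange(a, 0, mid);
            int[] right = Arrays.copyOfRange(a, mid, a.length);
            sort(left);
            sort(right);
            merge(a, left, right);
        }

        // Merge two sorted halves back into the destination array.
        private static void merge(int[] dest, int[] left, int[] right) {
            int i = 0, j = 0, k = 0;
            while (i < left.length && j < right.length) {
                dest[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
            }
            while (i < left.length) dest[k++] = left[i++];
            while (j < right.length) dest[k++] = right[j++];
        }
    }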

Using Merge Sort
03:39

Data structures are a way to organize data so that it is efficiently accessible for the problem you are trying to solve. Choosing the right data structure will depend on the type of problem you're trying to solve, the amount of data you need to organize, and the medium you use to store your data. We have already seen and used one example of a data structure: in the preceding topics, we made extensive use of arrays. Arrays are the most primitive of data structures. They provide access to your data using an index and are fixed in size, in contrast to dynamic data structures that can grow and make more space for data whenever it is needed.
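
The contrast between a fixed-size array and a dynamic structure can be shown with a short Java example (a sketch using the standard java.util.ArrayList, which may or may not be the structure used in the lecture):

    import java.util.ArrayList;
    import java.util.List;

    class ArrayVsList {
        public static void main(String[] args) {
            // Fixed-size array: the capacity is set once and cannot grow.
            int[] fixed = new int[3];
            fixed[0] = 10;                        // indexed access is O(1)

            // Dynamic structure: grows as elements are added.
            List<Integer> dynamic = new ArrayList<>();
            dynamic.add(10);
            dynamic.add(20);
            dynamic.add(30);
            dynamic.add(40);                      // no fixed capacity to worry about
            System.out.println(fixed.length + " vs " + dynamic.size());
        }
    }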

Getting Started with Fundamental Data Structures
13:10

This section summarizes what we have learned in the earlier topics.   

Summary
00:40
Test Your Knowledge
6 questions
+ Hash Tables and Binary Search Trees
6 lectures 32:10

In the preceding lesson, we introduced the concept of data structures by looking at arrays, linked lists, queues, and stacks. In this lesson, we will use some of these primitive structures to build more complex ones. We'll start the lesson by looking at hash tables, which are useful data structures for fast key-value lookup. In the second part of the lesson, we will learn about a more complex data structure that supports range queries: the binary search tree.

Preview 01:07

Hash tables provide us with a fast data structure for organizing key-value pairs and implementing a data dictionary. They are useful in a wide variety of applications due to their quick lookup and ease of use for in-memory data storage. Insertion and search operations have a typical average runtime complexity of O(1). Let's look at them in more detail.
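
A quick illustration of that key-value behaviour using Java's built-in java.util.HashMap (the lecture may instead build a hash table from scratch):

    import java.util.HashMap;
    import java.util.Map;

    class HashTableDemo {
        public static void main(String[] args) {
            // A data dictionary mapping words to definitions.
            Map<String, String> dictionary = new HashMap<>();
            dictionary.put("algorithm", "a set of instructions for accomplishing a task");
            dictionary.put("array", "a fixed-size, indexed data structure");

            // Insertion and lookup both run in O(1) on average.
            System.out.println(dictionary.get("algorithm"));
            System.out.println(dictionary.containsKey("graph")); // false
        }
    }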

Introducing Hash Tables Part 1
11:33

In this lesson, we will be looking at remainder and multiplication hash functions.   
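
As a rough sketch of the two ideas (the constants and details here are illustrative assumptions, not necessarily the lecture's choices): the remainder method maps a key to a bucket with a modulo, while the multiplication method keeps the fractional part of the key times a constant A and scales it to the table size.

    class HashFunctions {
        // Remainder (division) method: spread keys over m buckets using modulo.
        static int remainderHash(int key, int m) {
            return Math.floorMod(key, m);
        }

        // Multiplication method: multiply the key by a constant 0 < A < 1,
        // keep the fractional part, then scale it to the table size m.
        // Assumes a non-negative key.
        static int multiplicationHash(int key, int m) {
            final double A = 0.6180339887;        // (sqrt(5) - 1) / 2, a common choice
            double fractional = (key * A) % 1.0;
            return (int) (m * fractional);
        }
    }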

Introducing Hash Tables Part 2
05:52

Like hash tables, binary search trees are fast lookup data structures for organizing key-value pairs and implementing the data dictionary operations. In addition to providing insert, search, and delete, binary search trees support efficient queries such as finding the minimum and maximum, and finding a key's successor or predecessor. When using balanced binary search trees, insert and search operations have a worst-case runtime complexity of O(log n). This is a big theoretical improvement over the worst-case scenario of a hash table, which is O(n).
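
A bare-bones binary search tree sketch in Java, for orientation only (no balancing, so the O(log n) bound only holds when the tree happens to stay balanced):

    class BinarySearchTree {
        private static class Node {
            int key;
            Node left, right;
            Node(int key) { this.key = key; }
        }

        private Node root;

        // Insert: smaller keys go to the left subtree, larger keys to the right.
        void insert(int key) {
            root = insert(root, key);
        }

        private Node insert(Node node, int key) {
            if (node == null) return new Node(key);
            if (key < node.key) node.left = insert(node.left, key);
            else if (key > node.key) node.right = insert(node.right, key);
            return node;
        }

        // Search: O(log n) on a balanced tree, O(n) in the worst case.
        boolean contains(int key) {
            Node current = root;
            while (current != null) {
                if (key == current.key) return true;
                current = (key < current.key) ? current.left : current.right;
            }
            return false;
        }
    }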

Getting Started with Binary Search Trees
07:12

Traversing a binary tree is the process of stepping through each node of the tree and performing some sort of action on the data contained in the node. Let’s learn about it in detail.   
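
For example, an in-order traversal (sketched below against the hypothetical Node class from the previous binary search tree example) visits the keys of a binary search tree in ascending order.

    // In-order traversal: left subtree, current node, right subtree.
    // On a binary search tree this prints the keys in ascending order.
    static void inOrder(Node node) {
        if (node == null) return;
        inOrder(node.left);
        System.out.println(node.key);   // the "action" performed on each node
        inOrder(node.right);
    }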

Traversing a Binary Search Tree
06:00

This section summarizes what we have learned in the earlier topics.

Summary
00:26
Test Your Knowledge
8 questions
+ Algorithm Design Paradigms
5 lectures 22:25

In the last lesson, we learned about hash tables and binary search trees. Today, we will start with the algorithm design paradigms. These design patterns can be seen as the generic methods or approaches that motivate the design of a class of algorithms. Just as an algorithm is a higher abstraction than a computer program, an algorithm design paradigm is an abstraction higher than an algorithm. The choice of an algorithm paradigm is an important one when designing an algorithm.

This lesson will focus on the following three algorithm paradigms:

  • Greedy

  • Divide and conquer

  • Dynamic programming

Preview 01:41

Algorithms typically go through a sequence of steps, where at each step there is a set of choices. Greedy algorithms, as the name suggests, follow the heuristic of making the locally optimal choice at each step, in the hope of arriving at a global optimum. To better understand what we mean by this, let's introduce a problem.
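
A classic illustration (not necessarily the problem used in the lecture) is making change with coins: the greedy choice is to always take the largest coin that still fits. This happens to be optimal for coin systems such as {25, 10, 5, 1}, but not for every coin system.

    import java.util.ArrayList;
    import java.util.List;

    class GreedyCoinChange {
        // Greedy choice: always take the largest coin that still fits.
        static List<Integer> makeChange(int amount, int[] coinsDescending) {
            List<Integer> used = new ArrayList<>();
            for (int coin : coinsDescending) {
                while (amount >= coin) {
                    used.add(coin);
                    amount -= coin;
                }
            }
            return used;
        }

        public static void main(String[] args) {
            // Prints [25, 25, 10, 1, 1, 1]
            System.out.println(makeChange(63, new int[]{25, 10, 5, 1}));
        }
    }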

Introducing Greedy Algorithms
08:54

In lesson two, we introduced, among other sorting algorithms, merge and quick sort. A peculiarity of both algorithms is that they divide the problem into subproblems that are smaller instances of the same problem, solve each of the subproblems recursively, and then combine the solutions to the subproblems into a solution for the original problem. This strategy forms the basis of the divide and conquer paradigm.
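
A small example of the same three-step pattern, added here for clarity rather than drawn from the lecture: finding the maximum of an array by dividing the range in half, conquering each half recursively, and combining by taking the larger of the two results.

    class DivideAndConquerMax {
        // Returns the maximum of a[lo..hi].
        static int max(int[] a, int lo, int hi) {
            if (lo == hi) return a[lo];            // base case: a single element
            int mid = (lo + hi) / 2;
            int leftMax = max(a, lo, mid);         // conquer the left half
            int rightMax = max(a, mid + 1, hi);    // conquer the right half
            return Math.max(leftMax, rightMax);    // combine the two solutions
        }
    }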

Getting Started with Divide and Conquer Algorithms
06:13

After greedy and divide and conquer, we will turn our attention to dynamic programming. Dynamic programming is an algorithm design paradigm that also attempts to solve optimization problems by combining the solutions to subproblems. Unlike in divide and conquer, the subproblems overlap and need to exhibit optimal substructure for dynamic programming to be applicable.
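
A standard small example (illustrative, not necessarily the lecture's): computing Fibonacci numbers bottom-up, where each subproblem is solved once and its solution reused rather than recomputed.

    class FibonacciDP {
        // Bottom-up dynamic programming: table[i] stores the solution to the
        // subproblem "the i-th Fibonacci number", computed once and reused.
        static long fib(int n) {
            if (n < 2) return n;
            long[] table = new long[n + 1];
            table[0] = 0;
            table[1] = 1;
            for (int i = 2; i <= n; i++) {
                table[i] = table[i - 1] + table[i - 2];   // reuse stored solutions
            }
            return table[n];
        }
    }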

Understanding Dynamic Programming
05:18

This section summarizes what we have learned in the earlier topics.   

Summary
00:19
Test Your Knowledge
6 questions
+ String Matching Algorithms
5 lectures 17:20

String matching algorithms are quite common in text-editing programs. These kinds of programs frequently need to find all occurrences of a pattern in a text, where the text is usually the document being edited and the pattern is a word supplied by the user. Since text-editing programs aim to be responsive, having efficient algorithms for solving the string matching problem is fundamental.


Preview 01:01

In this topic, we will learn about naive search algorithms.   
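
For orientation, a naive search sketch in Java: try every alignment of the pattern against the text and compare character by character, giving a worst case of O(n·m).

    class NaiveSearch {
        // Returns the index of the first occurrence of pattern in text, or -1.
        static int indexOf(String text, String pattern) {
            int n = text.length(), m = pattern.length();
            for (int i = 0; i <= n - m; i++) {
                int j = 0;
                while (j < m && text.charAt(i + j) == pattern.charAt(j)) {
                    j++;
                }
                if (j == m) {
                    return i;      // the whole pattern matched at position i
                }
            }
            return -1;             // no occurrence found
        }
    }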

Naive Search Algorithms
04:02

The Boyer-Moore string searching algorithm was introduced by Robert S. Boyer and J. Strother Moore in 1977, and builds upon the naive search algorithm by intelligently skipping certain sections of the text. One key feature of the algorithm is that it matches the pattern from right to left, instead of left to right, using to its advantage a couple of shift rules that improve its running time. To understand the effect of these rules, let's build the Boyer-Moore algorithm from our naive search algorithm.   
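
The full algorithm combines a bad-character rule and a good-suffix rule. As a rough, simplified sketch of the bad-character idea only (this is the Boyer-Moore-Horspool variant, not necessarily the code developed in the lecture, and it assumes single-byte characters):

    class BadCharacterSearch {
        // Compare the pattern right to left; on a mismatch, skip ahead based on
        // the text character currently aligned with the end of the pattern.
        static int indexOf(String text, String pattern) {
            int n = text.length(), m = pattern.length();
            int[] shift = new int[256];
            java.util.Arrays.fill(shift, m);                 // default: skip the whole pattern
            for (int i = 0; i < m - 1; i++) {
                shift[pattern.charAt(i) & 0xFF] = m - 1 - i; // distance to the last pattern position
            }
            int i = 0;
            while (i <= n - m) {
                int j = m - 1;
                while (j >= 0 && text.charAt(i + j) == pattern.charAt(j)) {
                    j--;
                }
                if (j < 0) return i;                         // match found at position i
                i += shift[text.charAt(i + m - 1) & 0xFF];
            }
            return -1;
        }
    }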

Getting Started with the Boyer-Moore String Searching Algorithm
08:54

Even though the Boyer-Moore string search algorithm is the standard benchmark in the practical string-search literature, there are other string matching algorithms that are also suitable for different purposes. In this short topic, we present the following three, which are among the most famous:

  • Rabin-Karp

  • Knuth-Morris-Pratt

  • Aho-Corasick

However, we will only cover the implementation of Rabin-Karp.
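
For reference, a minimal Rabin-Karp sketch in Java (the base and modulus below are illustrative assumptions, not the lecture's exact values): compare rolling hashes first, and only compare characters when the hashes match.

    class RabinKarp {
        // Returns the index of the first occurrence of pattern in text, or -1.
        static int indexOf(String text, String pattern) {
            int n = text.length(), m = pattern.length();
            if (m == 0 || m > n) return (m == 0) ? 0 : -1;
            final long BASE = 256, MOD = 1_000_000_007L;
            long high = 1;                                    // BASE^(m-1) mod MOD
            for (int i = 0; i < m - 1; i++) high = (high * BASE) % MOD;

            long patternHash = 0, windowHash = 0;
            for (int i = 0; i < m; i++) {
                patternHash = (patternHash * BASE + pattern.charAt(i)) % MOD;
                windowHash = (windowHash * BASE + text.charAt(i)) % MOD;
            }
            for (int i = 0; i <= n - m; i++) {
                // Only verify character by character when the hashes agree.
                if (patternHash == windowHash && text.regionMatches(i, pattern, 0, m)) {
                    return i;
                }
                if (i < n - m) {                              // roll the window one character forward
                    windowHash = (windowHash - text.charAt(i) * high % MOD + MOD) % MOD;
                    windowHash = (windowHash * BASE + text.charAt(i + m)) % MOD;
                }
            }
            return -1;
        }
    }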

Introducing Other String Matching Algorithms
02:59

This section summarizes what we have learned in the earlier topics.

Summary
00:24
Test Your Knowledge
6 questions
+ Graphs, Prime Numbers, and Complexity Classes
7 lectures 47:56

Graph problems are very common in computer science, and they pervade many real-life applications. Everything that can be represented by entities and their relationships can ultimately be modeled by a graph. How we connect with friends on social media, how route-planning applications are able to find the shortest route, and how e-commerce websites are able to provide us with recommendations are all examples of problems modeled by graphs.

Preview 03:04

There are usually two standard ways to represent a graph G = (V, E) in a computer program:

  1. As a collection of adjacency lists.

  2. As an adjacency matrix.

You can use either way to represent both directed and undirected graphs. We'll start by looking at the adjacency list representation.
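
A minimal adjacency list sketch in Java, shown here for illustration (the lecture's own classes may be structured differently):

    import java.util.ArrayList;
    import java.util.List;

    class Graph {
        // adjacency.get(v) holds the list of neighbours of vertex v.
        private final List<List<Integer>> adjacency = new ArrayList<>();

        Graph(int vertexCount) {
            for (int v = 0; v < vertexCount; v++) {
                adjacency.add(new ArrayList<>());
            }
        }

        // For an undirected graph, record the edge in both directions.
        void addEdge(int u, int v) {
            adjacency.get(u).add(v);
            adjacency.get(v).add(u);
        }

        List<Integer> neighbours(int v) {
            return adjacency.get(v);
        }
    }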

Representing Graphs
06:19

A common activity on a graph is visiting each of its vertices in a given order. We will start by introducing breadth-first search, and then follow with depth-first search. Both of these techniques form the archetype for many important graph algorithms, as we will see later with cycle detection and Dijkstra's algorithm for single-source shortest paths.
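
A breadth-first search sketch in Java, visiting vertices level by level from a source vertex; the array-of-arrays adjacency representation is an assumption of this example.

    import java.util.ArrayDeque;
    import java.util.Queue;

    class BreadthFirstSearch {
        // adjacency[v] lists the neighbours of vertex v.
        static void bfs(int[][] adjacency, int source) {
            boolean[] visited = new boolean[adjacency.length];
            Queue<Integer> queue = new ArrayDeque<>();
            visited[source] = true;
            queue.add(source);
            while (!queue.isEmpty()) {
                int v = queue.poll();
                System.out.println("Visiting vertex " + v);
                for (int neighbour : adjacency[v]) {
                    if (!visited[neighbour]) {
                        visited[neighbour] = true;   // mark before enqueueing to avoid duplicates
                        queue.add(neighbour);
                    }
                }
            }
        }
    }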

Traversing a Graph
13:20

A shortest path is a path between two vertices such that the sum of the weights of the edges that constitute the path is minimized. The shortest path problem has many applications in the real world, from finding directions in mapping applications to minimizing the number of moves needed to solve a puzzle. In this topic, we shall look at two different strategies for computing shortest paths: one that finds the shortest paths from a single source to every other vertex in the graph, and another that finds the shortest paths between all pairs of vertices in a graph.
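
A sketch of the single-source case using Dijkstra's algorithm with a priority queue; the adjacency matrix with -1 meaning "no edge" is an assumption made here to keep the example short, and the algorithm requires non-negative edge weights.

    import java.util.Arrays;
    import java.util.PriorityQueue;

    class Dijkstra {
        // weights[u][v] is the weight of edge u -> v, or -1 if there is no edge.
        static int[] shortestPaths(int[][] weights, int source) {
            int n = weights.length;
            int[] dist = new int[n];
            Arrays.fill(dist, Integer.MAX_VALUE);
            dist[source] = 0;
            // Queue entries are {vertex, distance}, ordered by distance.
            PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> Integer.compare(a[1], b[1]));
            pq.add(new int[]{source, 0});
            while (!pq.isEmpty()) {
                int[] entry = pq.poll();
                int u = entry[0];
                if (entry[1] > dist[u]) continue;              // stale entry, skip it
                for (int v = 0; v < n; v++) {
                    if (weights[u][v] >= 0 && dist[u] + weights[u][v] < dist[v]) {
                        dist[v] = dist[u] + weights[u][v];     // relax the edge u -> v
                        pq.add(new int[]{v, dist[v]});
                    }
                }
            }
            return dist;                                       // shortest distances from source
        }
    }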

Calculating Shortest Paths
12:58

A prime number is a natural number greater than one whose only divisors are one and the number itself. Prime numbers play a very important role in the fundamental theorem of arithmetic: every natural number greater than one is either a prime or a product of primes. Nowadays, number-theoretic algorithms are widely used, mainly due to cryptographic schemes based on large prime numbers. Most of those cryptographic schemes are based on the fact that we can efficiently find large primes, but we cannot factor the product of those large primes efficiently.
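
The simplest way to check primality is trial division, sketched below; it is fine for small numbers, while cryptographic schemes rely on much larger primes and probabilistic tests instead.

    class Primes {
        // A number n > 1 is prime if no value between 2 and sqrt(n) divides it.
        static boolean isPrime(long n) {
            if (n < 2) return false;
            for (long d = 2; d * d <= n; d++) {
                if (n % d == 0) {
                    return false;          // found a divisor, so n is composite
                }
            }
            return true;
        }
    }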

Prime Numbers in Algorithms
02:46

In this lesson, we covered ways of representing and traversing a graph and looked at shortest path algorithms. Graphs are also an optimal data structure for some problems we haven't mentioned yet. This topic aims to introduce some of them.   

Other Concepts in Graphs
08:16

This section summarizes what we have learned in the earlier topics.   

Summary
01:13
Test Your Knowledge
9 questions
Requirements
  • Basic knowledge of Java, mathematics, and object-oriented programming techniques
Description

Learning about data structures and algorithms gives you better insight into how to solve common programming problems. Most of the problems programmers face every day have already been solved, tried, and tested. By knowing how these solutions work, you can ensure that you choose the right tool when you face these problems.


This course teaches you tools that you can use to build efficient applications. It starts with an introduction to algorithms and Big O notation, and later explains bubble, merge, and quick sort, along with other popular algorithms. You'll also learn about data structures such as binary trees, hash tables, and graphs. The course progresses to advanced concepts, such as algorithm design paradigms and graph theory. By the end of the course, you will know how to correctly implement common algorithms and data structures within your applications.


About the Author

Kristian Secor has been in the industry for quite a while. He has an M.S. focusing on web development and an Ed.D. focusing on educational technology. He has been developing for about 20 years and has taught frontend, backend, and mobile application courses at the university level for about 16 years. Currently, he is the Program Head for the Web Development Department at San Diego Mesa College. He is also an instructor for both user experience and frontend development at UC San Diego Extension.


Who this course is for:
  • If you want to better understand common data structures and algorithms by following code examples in Java and improve your application efficiency, then this is the course for you.