Everything that you do with a computer relies in some way on an algorithm. In fact, any piece of software (even the most advanced artificial intelligence) is only algorithms put together with some structured data. That's it: Algorithms + Data Structures = Software.
Algorithms define how to operate, what to do exactly to solve a problem. If we relate it to mathematics or physics, it could mean solving an equation step by step with a recipe (e.g. a robot solving a first-order equation).
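To make the "recipe" idea concrete, here is a minimal sketch of an algorithm as an explicit sequence of steps: solving the first-order equation a·x + b = 0. The function name and steps are illustrative, not from the original text.

```python
def solve_first_order(a, b):
    """Return x such that a*x + b = 0 (assumes a != 0)."""
    if a == 0:
        raise ValueError("not a first-order equation: a must be non-zero")
    # Step 1: move b to the right-hand side: a*x = -b
    rhs = -b
    # Step 2: divide both sides by a: x = -b / a
    return rhs / a

print(solve_first_order(2, -6))  # 2x - 6 = 0  ->  x = 3.0
```

The point is not the arithmetic but the shape: a finite, unambiguous list of steps that always terminates with the answer.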
Learning algorithms does not require you to know any programming language (algorithms remain the same, only the syntax changes); it only requires an understanding of the steps involved. It is, however, recommended to know one if you want to implement your own algorithms and see them run.
We will learn how to implement very useful and powerful algorithms ourselves. We will cover programming techniques such as: complexity analysis, the divide-and-conquer strategy, recursion, merging, optimization, string rewriting systems, the partition method...
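As a taste of what is ahead, here is a sketch that combines three of the techniques just listed: divide and conquer, recursion, and merging. It is the classic merge sort, given here only as an illustration, not as part of the course material itself.

```python
def merge(left, right):
    """Merge two already-sorted lists into one sorted list."""
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # one of these is empty; append the rest
    merged.extend(right[j:])
    return merged

def merge_sort(items):
    """Sort a list by recursively splitting it and merging the halves."""
    if len(items) <= 1:                   # base case: trivially sorted
        return items
    mid = len(items) // 2                 # divide...
    left = merge_sort(items[:mid])        # ...and conquer recursively
    right = merge_sort(items[mid:])
    return merge(left, right)             # combine by merging

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```

Notice how each technique plays a distinct role: recursion drives the splitting, and merging does the actual sorting work.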
Once you have a handle on some algorithmic concepts, you can go further with data structures, complexity, a new programming language, and create your own piece of software.
Understanding algorithms gives you a great vision of the digital world. Without understanding them it is difficult to see things from a higher perspective, or to predict whether an approach will work efficiently or produce unacceptable results.
The more we know about algorithms, the better our chances are of finding a good way to solve a problem. In many cases, a new problem can be reduced to old problems without too much effort.
Many problems, even those that may not seem realistic, call on the same algorithmic knowledge that comes up every day in the real world.
They are far easier to understand than you might imagine
Algorithms are almost always associated with mathematics and all the obscurity it inspires. The math is useful, but you won't need it most of the time.
Don't spend any time memorizing algorithms
That's not the point: instead, try to understand how different algorithms approach different problems. See what makes one approach slow and another fast, and learn what the tradeoffs are. The key is to gain insight into how we can make computers do things.