The ease of any given programming language is highly subjective, and not concretely related to its “level” on the abstraction ladder. For example, functional languages like Elm and Haskell are extremely high level, yet many people find them harder to use than more common procedural languages like Python and C#.
As for “actual programming”: Programming is programming regardless of the language. The only meaningful distinction between low and high level is in the conceptual distance between the algorithms and the actual machine code that compilers generate (or interpreters execute).
I’m currently in the process of learning languages like C# because if I just learned Python/Lua, I wouldn’t understand the fundamentals of programming, and therefore would not be a good programmer.
Can you define the “fundamentals of programming”, and explain how C# would help you understand them?
If you just want to learn the minutiae of memory management, and some generalities about how the OS executes your code, C# won’t help you. It’s not a low level language in any sense, and it doesn’t expose those details. Learning C would be more instructive in that respect, but if you really want to understand things on a deeper level, I would recommend this book: http://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686
However, if you want to become a “good programmer”, you don’t have to chase low level details. That kind of knowledge can help in certain situations, but unless you’re working in a highly specialized area, those situations are rare. In the modern software ecosystem, success is largely about understanding larger systems, and being able to fit them together into something cohesive.
Ultimately, a good programmer delivers useful products or services in a timely manner. If you need to know low level details to deliver working software, then those details are worth learning. If not, then you’re basically wasting your time.