I think "scripting language" is an awful word, that is extremely outdated or at best suits a class of domain specific languages. Your teacher is just aligning everything he clearly doesn't have enough understanding about into an axis of evil.
A sensible distinction to make is between high level and low level languages, or between statically and dynamically typed ones; those two axes are truly orthogonal.
Assembler is low level and dynamically typed (if speaking of types makes any sense there at all), C is low level and statically typed, Ruby is high level and dynamically typed, Haskell is high level and statically typed. Java is statically typed and neither particularly high nor low level; C++ is statically typed and both high and low level at once. And so on.
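To make the typing axis concrete, here is a minimal sketch of what "statically typed" means in practice (the variable names are made up for illustration):

    #include <stdio.h>

    int main(void)
    {
        int count = 42;            /* the type of 'count' is fixed at compile time */
        const char *name = "foo";

        printf("%d %s\n", count, name);

        /* count = name;   <- a C compiler flags this as a type mismatch
         * before the program ever runs. In a dynamically typed language
         * like Ruby, rebinding a variable to a value of another type is
         * perfectly legal, and errors only surface at runtime, when the
         * value is used in a way its type doesn't support. */
        return 0;
    }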
The only discussion worth having is which paradigms are more suitable for an entry level programmer.
I am quite convinced low level programming isn't one of them. It might have been, back in the early 90s, when you could still produce interesting results with it in reasonable time.
But programming is fueled by passion. Passion is nourished by rewards. Therefore, entry level programmers should start with rewarding tools. Low level tools are no longer rewarding, because there's a vast sea of high level tools that get you the same result in a fraction of the time.
Human thinking is abstract. As we learn to understand the world, we do so through very coarse-grained abstractions and go into detail only as needed.
To help a child understand its environment, you don't teach it mathematics, then physics, then chemistry, then biology, then history, sociology and philosophy. You give it a very simple model of the world to cope with, and it will long to reach past that model on its own, endlessly firing questions at you when young and flatly rejecting your authority later.
That is how we think. The human brain can only process a limited number of information "units", but the degree of abstractness matters little in how that information is quantized. For example, reading the expression '34*75' is simpler for us than calculating it, whereas for computers it's the other way around. Recognizing (and thereby abstracting) a bunch of black pixels into a squiggly line, which can then be recognized (and thereby yet again abstracted) as an individual digit, is a tremendous amount of work.
My grandmother understands the idea of opening a file, but she has no understanding of anything beneath that level. And frankly, if she had had to learn it by first studying the internal workings of the hardware and the operating system and whatnot, she never would have gotten there.
There are a lot of people out there who overcomplicate things because they were never taught to think in terms of clear, concise and thereby elegant solutions, but instead spent too much time bothering with interchangeable low level details and solving problems against those. Teaching people to think like computers is the worst possible approach to programming.
The value of programming lies in finding a solution to a problem. Expressing it as code is really more of a dull, mechanical task and should simply be done with whatever tools fit best.
Oh, and don't worry about not having understood pointers. I had the same problem at about the same age. The problem here, too, is a lack of abstraction: classically you learn about pointers from some C book, and while you're struggling to understand them, they come bundled with memory allocation, and thus with stack and heap memory, and so on. The abstract concept behind pointers is indirection. A variable that holds an index into a specific array is just that (in C it's actually the very same thing, except the "specific array" is your whole address space), and you need no pointer arithmetic for it; see the sketch below.
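Here is a minimal sketch of that idea in C (the array and names are invented for illustration): an index into an array already gives you indirection, and a real pointer is the same thing with the address space playing the role of the array.

    #include <stdio.h>

    int main(void)
    {
        const char *names[] = { "Alice", "Bob", "Carol" };
        int favourite = 1;                 /* "points at" Bob, indirectly */

        printf("%s\n", names[favourite]);  /* one level of indirection: Bob */

        favourite = 2;                     /* re-aim the "pointer" */
        printf("%s\n", names[favourite]);  /* now Carol */

        /* A real C pointer is the same idea, except that the "array" it
         * indexes into is the whole address space. */
        const char **p = &names[0];
        printf("%s\n", *(p + favourite));  /* equivalent to names[favourite] */

        return 0;
    }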
This is just meant to illustrate that choosing a higher level of abstraction makes things a lot easier to grasp.
EDIT: and when it comes to typing, I prefer statically typed languages. And I think entry level programmers should clearly understand the concept of types (which is itself an abstract one).