I personally believe that graphical programming is the future. I can agree with many of LabVIEW's deficiencies, and the future may or may not include LabVIEW.
As a professional LabVIEW developer, I think about things like memory allocation and efficiency all day long. LabVIEW just helps you see the programming structures from a high level. I actually like reading the white papers on the LabVIEW compiler and its optimizations. There are a lot of docs and posts that explain some of the detailed guts of LabVIEW (and how to use them to your advantage). For example, one project that I know of processed several very large images hundreds of times per second, using them to detect product flaws on the millimeter scale. Eight cores and several image acquisition cards were used, and none were NI-branded. (PS: the hardware setup took longer than the software development.)
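The multi-core anecdote above comes down to dataflow parallelism: each image frame is independent, so frames can be fanned out across worker processes, one per core. The real project was written in LabVIEW, where this happens implicitly; as a rough text-language analogy, here is a minimal Python sketch (the function names and the flaw threshold are hypothetical):

```python
from multiprocessing import Pool

FLAW_THRESHOLD = 200  # hypothetical brightness cutoff for a "flaw" pixel

def detect_flaws(frame):
    """Count pixels above the threshold in one image frame.

    `frame` is a list of pixel-intensity rows; a real system would use
    an array fed by an image-acquisition card instead.
    """
    return sum(1 for row in frame for px in row if px > FLAW_THRESHOLD)

def process_batch(frames, workers=8):
    """Fan frames out across worker processes, one per core."""
    with Pool(processes=workers) as pool:
        return pool.map(detect_flaws, frames)

if __name__ == "__main__":
    # Two tiny synthetic "frames": one clean, one with two flaw pixels.
    clean = [[10, 20], [30, 40]]
    flawed = [[10, 250], [255, 40]]
    print(process_batch([clean, flawed], workers=2))  # [0, 2]
```

In LabVIEW, wiring independent loops or subVIs gives you this parallelism for free; in a text language you have to ask for it explicitly, which is part of the high-level-view point being made.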
I also learned C and C++ before LabVIEW, and I can't stand them anymore, especially now that I have looked at iPhone development. (Yes, I know that it is Apple's flavor of Objective-C.)
Despite all of this, I can agree that LabVIEW is not used nearly as much as other languages. Exposure to other languages is very important.
Here is a video that might address many of the "early" issues that new developers complain about:
http://www.youtube.com/watch?v=4BppvSzsrNk&fmt=18