Introduction to programming: it only gets worse, doesn't it?

Top 20 replies by Programmers when their programs don't work

20. "That's weird..."

19. "It's never done that before."

18. "It worked yesterday."

17. "How is that possible?"

16. "It must be a hardware problem."

15. "What did you type in wrong to get it to crash?"

14. "There is something funky in your data."

13. "I haven't touched that module in weeks!"

12. "You must have the wrong version."

11. "It's just some unlucky coincidence."

10. "I can't test everything!"

9. "THIS can't be the source of THAT."

8. "It works, but it hasn't been tested."

7. "Somebody must have changed my code."

6. "Did you check for a virus on your system?"

5. "Even though it doesn't work, how does it feel?

4. "You can't use that version on your system."

3. "Why do you want to do it that way?"

2. "Where were you when the program blew up?"

And the Number One reply by programmers when their programs don't work:

1. "It works on my machine."

I, uh


I think I said half of those today while demoing something I've been working on.
 
Really, the professor can do better than this for their intro-level students. They could set up a common Java dev environment for all their students to use via a distributed virtual machine image. Tools that help you do this (e.g. Vagrant) are pretty common nowadays.

Then again, learning how to do all of this setup, and how to manage development/deployment environments and all that, is a valuable skill. Learning it might help avoid the scenario I described in my last post.

Just maybe not in an intro class.
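To be concrete, a minimal Vagrantfile along those lines might look like this (the box name and package are placeholders for whatever the course actually standardizes on):

```ruby
# Minimal sketch of a shared class environment: every student gets the
# same OS, the same JDK, and the same PATH, so "works on my machine" goes away.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"            # placeholder base image
  config.vm.provision "shell", inline: <<-SHELL
    apt-get update
    apt-get install -y default-jdk             # same javac for everyone
  SHELL
end
```

Students would just run `vagrant up` and `vagrant ssh` and compile inside the VM.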
 
Mono! Unless you're learning Java...



Depends on what you're going for.



Just VS Code. There's always Mono.
I would say there's not much reason to use Mono anymore when Roslyn is open source and cross-platform. Why use Mono when you can get the real thing?

Edit: oh, you're recommending MonoDevelop. I have no experience with MonoDevelop, so I have no opinion there.
 
Java is a great starting language. It's not about the actual language but the concepts.

Exactly. Python is a good language too, although I know almost nothing about it.

Ehh... neither is that great, to be honest. The only thing they're good for is getting people hands-on with control flow and pass by reference vs. pass by value.

Java has a pretty awful object system, and Python has dynamic typing, which is probably the most god-awful feature ever popularized in computer science.
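To be concrete about the pass-by-reference vs. pass-by-value bit: Java passes everything by value, and for objects it's the reference that gets copied. A quick sketch (class and method names made up):

```java
// Java passes everything by value. For objects, the *reference* is copied,
// so mutations are visible to the caller but reassignment is not.
class Box { int value; }

public class PassDemo {
    static void mutate(Box b)   { b.value = 42; }             // caller sees this
    static void reassign(Box b) { b = new Box(); b.value = 7; } // caller does not

    public static void main(String[] args) {
        Box box = new Box();
        mutate(box);
        System.out.println(box.value); // 42
        reassign(box);
        System.out.println(box.value); // still 42
    }
}
```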
 
Really, the professor can do better than this for his intro-level students. S/he can set up a common Java dev environment for all their students to use via a distributed virtual machine image. Tools that help you do this (i.e. Vagrant) are pretty common nowadays.

Then again, learning how to do all of this setup stuff and understanding how to manage development/deployment environments and all that, is a valuable skill. Learning that might help avoid the scenario I described in my last post.

Just maybe not in an intro class.

Most CS programs I've looked at seem to give students shell accounts with the tools required for their basic classes available: gcc/g++, javac, gprolog, et al. I'm surprised that the OP doesn't seem to have that.
 
I went through procedural C++ a year ago. This semester I have object-oriented C++ AND ARM Cortex-M microcontroller assembly to learn!!! FUCCCCCKKKK

say-low-level-one-more-goddamn-time.jpg
There are few things more enlightening when learning to code than the first time you learn an assembly language. Don't treat it as something you don't need.
 
Most CS programs I've looked at seem to give students shell accounts with the tools required for their basic classes available: gcc/g++, javac, gprolog, et al. I'm surprised that the OP doesn't seem to have that.

That's what I thought too. The last course I took was in machine learning on Coursera, and the professor there just distributed a Linux VM, with GUI and everything, for us to work on all the projects from. It not only helps the students, it helps the teacher avoid the "works on my machine" issue.
 
Java still has a special place in my heart for its almost inane specificity.

Public static void main and all that

Also, you don't have to worry about memory management, it's very OOP, and knowledge of its syntax can transfer to many other languages. So part of me is glad I was first taught Java. Part of me.
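For anyone in the thread who hasn't written Java, this is the specificity in question: the smallest runnable program already needs a class, a fully qualified main, and a println just to print one line.

```java
// Java's minimal "Hello, World!" program.
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello, World!");
    }
}
```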
 
Top 20 replies by Programmers when their programs don't work [...] 1. "It works on my machine."

Yeah, 20/20.
 
I don't know why they are still teaching Java in introductory classes.

Because no matter how many options might be better in a vacuum now, there are enough resources for Java, and it does enough of the things you need in an intro class, for it to still have a strong presence.

Kinda weird to see all the hate for C++ on a gaming forum when games are primarily written in C++. There's a reason they don't write them in Java, C#, etc.

Yes, but that reason has nothing to do with what works well when teaching people fundamentals.
 
Ehh... neither is that great, to be honest. The only thing they're good for is getting people hands-on with control flow and pass by reference vs. pass by value.

Java has a pretty awful object system, and Python has dynamic typing, which is probably the most god-awful feature ever popularized in computer science.

...wut.
 

It's not a super uncommon position. Dynamic typing requires a ton of (slow, performance-killing) work on the back end, potentially promotes sloppy programming, and makes it almost impossible to mathematically prove anything about program correctness; there are lots of things that make it sloppy compared to statically typed languages.

Of course, since programming is actually much more about solving real problems efficiently than it is about technical notions of correctness, the power and flexibility of dynamically typed languages is incredibly useful. But it makes sense why people have a problem with it.
 
It's not a super uncommon position. Dynamic typing requires a ton of (slow, performance-killing) work on the back end, potentially promotes sloppy programming, and makes it almost impossible to mathematically prove anything about program correctness; there are lots of things that make it sloppy compared to statically typed languages.

Of course, since programming is actually much more about solving real problems efficiently than it is about technical notions of correctness, the power and flexibility of dynamically typed languages is incredibly useful. But it makes sense why people have a problem with it.

It seems uncommonly silly to me. Dynamic vs. static in almost all situations should come down to a matter of preference. I feel like someone who really thinks it's the worst thing ever also isn't knowledgeable enough about the compilation process to make a reasoned argument about type checking optimality. I mean, all this stuff is just a construct anyway. It's not like types exist at assembly level or lower.
 
It seems uncommonly silly to me. Dynamic vs. static in almost all situations should come down to a matter of preference. I feel like someone who really thinks it's the worst thing ever also isn't knowledgeable enough about the compilation process to make a reasoned argument about type checking optimality. I mean, all this stuff is just a construct anyway. It's not like types exist at assembly level or lower.

Types do exist at the assembly level when you're dealing with integer and floating-point ALUs. Never mind that types exist explicitly to describe a grouping of data. Dynamic typing has, yes, been a boon to productivity and to people who may not be as apt at dealing with finely developed type systems. But more often than not, what you want is type inference, not dynamic typing. Dynamic typing is a serious lapse in language correctness; if you're going to have weak typing at all, at least have it be static. All this stuff is a construct anyway? Yes, a construct of explicitly typed structs and objects. The second you allow a data construct to be manipulated as something it's not, you've thrown correctness out the window.

Dynamic vs. static should never come down to preference; it should come down to what your project demands. Dynamic typing exists because some people can't be fussed with important details, and it's simply good enough. That doesn't mean it's a good thing, though, and dynamic typing has put us on a serious language detour for the past 20 years. To toss it back to you: I feel like someone who really thinks they can handwave types out of a programmer's mental tax also isn't knowledgeable as to why a programmer should pay that mental tax. Your mentality is the very one that is ingrained in a lot of people who start with Python or Ruby as their first language, and it's one that is naive to the compilation process, not enlightened by it. The people who make dynamic languages know well the fire they're playing with, but adopters often do not.
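A small sketch of the type-inference point, for the record (this uses Java 10's `var`, so it's an illustration rather than anything the thread's intro class would see): you get dynamic-typing brevity while the compiler still checks everything.

```java
import java.util.ArrayList;

public class InferenceDemo {
    public static void main(String[] args) {
        // Inferred as ArrayList<String>; no explicit type annotation needed.
        var names = new ArrayList<String>();
        names.add("Ada");
        // names.add(42);  // would be a compile error, not a runtime surprise
        System.out.println(names);
    }
}
```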
 
Setting up a dev environment can be one of the hardest parts of a new dev job.

Don't worry, OP. You'll get desensitized to it, and be damn surprised when an employer or a class actually streamlines it.
 
Here's what I learned: if you have no programming experience, the first day is always the worst.

Not even knowing what to do or what commands to enter or anything is the goddamn worst.

The programs certainly get harder, more intense, and often frustrating, but nothing beats that initial first-day shock.

And if you're a CS major, I think you'll find later on in your classes that you'll be doing more computer science (logic, digital control, etc.) than programming. But I'm not a CS major, just a physics major who had to take coding classes, so I may be talking out of my ass.

Best of luck either way.
 
I've been spending free time setting up Vagrant web servers for my coworkers to get around debugging stuff like this.
 
Java's problems boil down to memory management, really.

It's garbage. And not the collector kind.

It's a fucking Pac-Man.

Because if we start talking about syntax or its OOP implementation, then we could criticize every fucking language out there.

My work PC has Linux Mint and Eclipse with 16GB of DDR4, and I still get slowdowns while debugging shit.

Other languages that I like are C# and JavaScript, in general. I made some small games using C# while learning, and it's pretty similar to Java, so I'm glad most universities get it right by using it as a learning language.

Also, the IT world is full of jobs for Java programmers, so you won't really have a problem finding good jobs that use Java.
 
I'm relatively new to programming as a career, but why do you think this?
I could write pages about why, but the short of it is that OOP is very hard to do well, and most people simply don't understand it even after years of experience (me included). And even if you do, that doesn't mean using it will be advantageous to the project anyway.
In my opinion, OOP is so popular because it's abstract, because it kinda falsely seems to make a lot of sense to beginners, and because it's easy for a business to build hierarchical structures around.

Like all paradigms it has its place, but in my opinion it's widely overused, misused, and not well understood.
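A tiny sketch of the kind of misuse I mean (class names made up): inheritance reached for where composition is the honest design.

```java
import java.util.ArrayList;

// "Is-a" abuse: this stack inherits add()/remove() at arbitrary indices,
// so callers can silently break the LIFO contract.
class BadStack<T> extends ArrayList<T> {
    public void push(T item) { add(item); }
    public T pop() { return remove(size() - 1); }
}

// Composition exposes only the operations a stack should have.
class GoodStack<T> {
    private final ArrayList<T> items = new ArrayList<>();
    public void push(T item) { items.add(item); }
    public T pop() { return items.remove(items.size() - 1); }
    public boolean isEmpty() { return items.isEmpty(); }
}
```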

This sounds awesome. However, it'd be nearly useless for people who take intro programming courses to supplement another STEM degree, which is probably why most universities don't do this. But for people who actually want to learn programming/computer science, it sounds wonderful.
Yeah, I'm not sure how I feel about it, but I can understand that perspective.
 
I could write pages about why, but the short of it is that OOP is very hard to do well, and most people simply don't understand it even after years of experience (me included). And even if you do, that doesn't mean using it will be advantageous to the project anyway.
In my opinion, OOP is so popular because it's abstract, because it kinda falsely seems to make a lot of sense to beginners, and because it's easy for a business to build hierarchical structures around.

Like all paradigms it has its place, but in my opinion it's widely overused, misused, and not well understood.

Most paradigms are misused and not well understood by novices. The solution is not to abandon OOP; it's to accept the fact that everything has a learning curve. A novice programmer writing purely procedural code will write code just as bad as, or worse than, a novice writing object-oriented code.
 
It seems uncommonly silly to me. Dynamic vs. static in almost all situations should come down to a matter of preference. I feel like someone who really thinks it's the worst thing ever also isn't knowledgeable enough about the compilation process to make a reasoned argument about type checking optimality. I mean, all this stuff is just a construct anyway. It's not like types exist at assembly level or lower.

Dynamic types can cause a lot of problems, in my experience, when you mix types you don't mean to mix. I can only imagine the problems it causes when someone learns on that and gets sloppy with it because they never learned to be disciplined about it.
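You can simulate that failure mode even in Java by typing everything as Object, which is effectively what a dynamic language does under the hood (a contrived sketch, obviously):

```java
public class MixDemo {
    public static void main(String[] args) {
        Object count = "10";   // a string snuck in where a number was expected
        // Compiles fine, then dies at runtime with a ClassCastException.
        int total = (Integer) count + 5;
        System.out.println(total);
    }
}
```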

those are mathematical processes, everything is ultimately a coin flip

I've known people who were great at math and couldn't hack it at programming even though they tried to go into CS, and then people who weren't great at math who did pretty well with coding. I find the mentality for doing math problems and for coding to be a bit different, even if they share common logical ways of thinking. I just find code structure to be different enough that I don't think being good at math is a requirement.
 
The second you allow a data construct to be manipulated as something it's not, you've thrown correctness out the window.

The web is built on text. Dynamic typing is what keeps the whole thing running. The idea that it enables some small productivity boost for people who aren't "apt" enough to handle static typing is pretty silly; this entire realm of software relies on the benefits of dynamic type systems as a fundamental aspect of how they operate.
 
There are a shitload of growing pains when you start to learn development. You'll feel like an idiot for probably the first few months; then you transition to feeling like you're just behind everyone else.

Tough it out. Every time you get stuck for a few hours, ask for help. Take notes, watch videos, Google -a lot-.

It gets better. Usually because you become a bit masochistic.

This.....
 
At one point in my life I seriously entertained the idea that I could be a programmer. I took a C++ class in high school which I hated, though not necessarily the subject so much as the class itself (I was a senior and it was full of dumbass freshmen at the time). Wound up getting a C (heh) or something.

I thought I'd give programming another shot in college, and took a class on QBasic. Had much more fun, and it was an overall more enjoyable experience. The next semester I tried to take Pascal, and that, sadly, didn't turn out to be as fun. We got into all kinds of weird, complicated shit, and I decided that the subject probably wasn't for me. Ah well.
 
So I'm on my first day of intro to programming. The very first assignment is to create the Hello World "program" and I can't even get javac to work. I googled my error ('javac' is not recognized as an internal or external command, operable program or batch file), set my PATH in the environment variables, and it still doesn't work. This isn't my PC, but I installed Java SE. It's Windows 8, but that doesn't seem to matter. It's very frustrating to be stuck on what seems to be a very simple problem. I really wish this wasn't an online class.

Yeah, that's gonna suck for you OP. At uni, I always found that having someone in the lab with me to help me talk through a program and debug it with my brain was a lot better than just reading notes/slides.

Then again, I kinda coasted through my actual programming classes but still ended up getting a good Computer Science degree.
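For what it's worth, OP, the two usual culprits with that error are a PATH entry that points at the JRE instead of the JDK's bin folder, and not opening a new Command Prompt after editing the environment variables (already-open windows keep the old PATH). A quick sanity check in a fresh Command Prompt, where the install path below is just a common default and yours may differ:

```
set "PATH=%PATH%;C:\Program Files\Java\jdk1.8.0_05\bin"
javac -version
```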
 
Well at least you're learning Java, so things can't get that much worse yet*. Just wait until you move on to C and they start babbling on about pointers. That's when you know things are truly starting to get worse.

*I think. Disclaimer, I've never used Java before, but I hear it's similar to C# so that's what I'm basing this off.
 
The web is built on text. Dynamic typing is what keeps the whole thing running. The idea that it enables some small productivity boost for people who aren't "apt" enough to handle static typing is pretty silly; this entire realm of software relies on the benefits of dynamic type systems as a fundamental aspect of how they operate.

The industry leans on it because it's there. This is an appeal to tradition, or an appeal to popularity.

There is very much a reason why Ruby, Python, and JS are on the decline. There is very much a reason why we see more web tech being driven by Java, Scala, Go, D, and even a return to C++ as C++11/14.

You don't need dynamic typing to parse all the text that builds the web. In fact, you want clear boundaries for the cases when you parse out some numeric quantity and want to manipulate it. Never mind that all the text processing, fuzzy matching, and heavy use of regex very much ask for highly parallelizable compiled code. It's why Scala is so huge, and why Mozilla is working hard on WebAssembly, Emscripten, and Rust.

There is simply an acknowledgement that dynamic typing has created a huge array of problems because it was pushed so hard at the start of the web revolution. Just because that is what the web has come to be built on doesn't mean that it isn't a mistake, or that we shouldn't strive to move away from it.
 
Most paradigms are not misused and not well understood by novices. The solution is not to abandon oop, it's to accept the fact that everything has a learning curve. A novice programmer writing purely procedural code will write just as bad or worse code than a novice writing object oriented code.

I kept my response short because I don't want to get into a paradigm war. Like I said, I could write pages about it. Also, I didn't mean day-one novices but people with years of experience.
 
My upcoming programming class has us learning C#, but I want to get ahead of the game. I'm a complete beginner. Are there any books/videos/exercises that can help flatten the steep learning curve?
 
Well at least you're learning Java, so things can't get that much worse yet*. Just wait until you move on to C and they start babbling on about pointers. That's when you know things are truly starting to get worse.

*I think. Disclaimer, I've never used Java before, but I hear it's similar to C# so that's what I'm basing this off.
I don't remember pointers ever being a big deal to me. I'm still not sure why pointers trip some people up. I can understand getting tripped up by many other parts of programming, but not pointers.
 
Ehh... neither is that great, to be honest. The only thing they're good for is getting people hands-on with control flow and pass by reference vs. pass by value.

Java has a pretty awful object system, and Python has dynamic typing, which is probably the most god-awful feature ever popularized in computer science.

I'm guessing you're going to suggest C or C++ as an intro... so that newbies are more worried about what a seg fault is, and whether they're passing something by value or by reference, than about actual programming concepts.

I think any language where people can be tripped up on language-specific syntactical differences isn't a good one to start with. How does knowing how pointers work help you understand inheritance or recursion?

Also, you're going to need some proof that JS and Python are on the decline. I find that extremely hard to believe. Python is still the scripting language of choice.
 
I'm guessing you're going to suggest C or C++ as an intro... so that newbies are more worried about what a seg fault is, and whether they're passing something by value or by reference, than about actual programming concepts.

I think any language where people can be tripped up on language-specific syntactical differences isn't a good one to start with. How does knowing how pointers work help you understand inheritance or recursion?

Also, you're going to need some proof that JS and Python are on the decline. I find that extremely hard to believe. Python is still the scripting language of choice.

I would never recommend C++. C maybe, but not C++. C++ just has too many gotchas. C, on the other hand, is a small language. If you want to teach OO, I would probably suggest Java or C# as the beginner languages (with an edge to C#, but that's my preference).
 
I kept my response short because I don't want to get into a paradigm war. Like I said, I could write pages about it. Also, I didn't mean day-one novices but people with years of experience.

I'm not disagreeing with you about that; I just think it's common for people to remain novices for many years. Sometimes forever. It's more about the person than the paradigm.
 
I'm guessing you're going to suggest C or C++ as an intro... so that newbies are more worried about what a seg fault is, and whether they're passing something by value or by reference, than about actual programming concepts.

I think any language where people can be tripped up on language-specific syntactical differences isn't a good one to start with. How does knowing how pointers work help you understand inheritance or recursion?

Also, you're going to need some proof that JS and Python are on the decline. I find that extremely hard to believe. Python is still the scripting language of choice.

Nah, I'd probably still suggest Java, though I don't think it's that great. But I'd touch on C/C++ early. Preferably C; OO can come from elsewhere. Python is on the decline for web backends. Same for Ruby. JS isn't, but the writing's on the wall: Mozilla, Google, Amazon, and Apple have all been taking steps for at least the past four years to move off of JS. It's just too ubiquitous at the moment. JS is around because of efforts that paved its success starting in the '90s, and because it was the only "mature enough" thing come the early 2000s.

Go, although not suited to replace C, is a perfect example of what people are moving to: strongly, statically typed compiled languages with GC and options for thread safety. Python offers none of those. Even in scientific computing, people are moving off of Python for Julia, or pushing Python code into C via FFI. Even Rust is designed so Ruby and Python users can pick it up for their scripting, or use it over FFI.

But of course, when a language is that prominent it's not going away overnight. Java I'm slightly less critical of, but it's a cumbersome, ugly language. In some ways it failed as a programming language, but it succeeded in proving out VM languages; thus we have Scala, Clojure, Ceylon, Kotlin, and X10 all targeting the JVM. C# has proved to be a cleaner version of Java, and now we're seeing it and F# open sourced, with official MS endorsement of Xamarin. Google is moving off of Oracle's libraries and already has its own VM on Android.

However, I think you're being unfair when you say that learning inheritance or recursion is usurped by dealing with pointers. Stack and heap memory management and pointer arithmetic are usurped just the same by learning other languages first. Basic control flow is basically the same in every language, save the functional ones. So yeah, I begrudgingly give a pass to Java because I'm not a sadist, but I think recommending Python is a huge disservice to people.
 
Top 20 replies by Programmers when their programs don't work [...] 1. "It works on my machine."

I work in QA; this sums up my interactions with devs 😁
 
There is very much a reason why Ruby, Python, and JS are on the decline. There is very much a reason why we see more web tech being driven by Java, Scala, Go, D, and even a return to C++ as C++11/14.

I'm talking about what actual people do when building actual projects in the real world. I know there are plenty of corners of the internet where people will talk about The Coming Scala Revolution (or whatever) and believe it means something, but the reality is that the web runs on JavaScript, PHP, Python, and Ruby, and there's very little chance that's going to change in a meaningful way in the near future.
 
It's never been easier to be a programmer. (The language doesn't matter, for the most part.) If you post on GAF before you find the answer on Google, you're probably an artist. Rejoice.
 
As soon as I see bashing of C++ or Java in this thread, I realise it's all fake programmers in here. Or just students who took a couple of programming classes and claim they know the world.
 