
Programming |OT| C is better than C++! No, C++ is better than C

Jarmel

Banned
You mean the Linux environment, system calls, or Linux systems implementation?

lol, I have no clue what the difference between those three is. I think it's the Linux environment, as we're doing introductory stuff with the terminal and text.

Edit: We are doing some basic system call stuff, but the issue I'm having now is separate from that.
 

Two Words

Member
I am having a dispute with a professor's grading on an assignment. First I want to ask people how they would interpret the assignment.

The assignment says the program should ask for two strings from the user. The program will then state whether the second string is a substring of the first string. That's all the assignment says. From that, would you believe that the definition of substring for this problem is or is not case sensitive?

I assumed it is case sensitive because strings are nothing more than sequences of characters. A and a are not the same character. "apple" is not a substring of "Apples". The professor disagrees and believes her problem implied ignoring case. She says I should have written that the user must enter all lower case. I lost 40% of the points this problem had on the homework due to this. I'm going to her office today to dispute this with her. How do you guys feel about this?
 

JesseZao

Member
I am having a dispute with a professor's grading on an assignment. First I want to ask people how they would interpret the assignment.

The assignment says the program should ask for two strings from the user. The program will then state whether the second string is a substring of the first string. That's all the assignment says. From that, would you believe that the definition of substring for this problem is or is not case sensitive?

I assumed it is case sensitive because strings are nothing more than sequences of characters. A and a are not the same character. "apple" is not a substring of "Apples". The professor disagrees and believes her problem implied ignoring case. She says I should have written that the user must enter all lower case. I lost 40% of the points this problem had on the homework due to this. I'm going to her office today to dispute this with her. How do you guys feel about this?

I mean, if she's having you assume things, that's not a very good precedent for a programming assignment. I would have interpreted it the same as you.
 

phoenixyz

Member
The assignment says the program should ask for two strings from the user. The program will then state whether the second string is a substring of the first string. That's all the assignment says.

If that's literally all the information there is (so no other conventions from earlier tasks or anything like that), I think you are in the right. For me, substring means that the characters are equal, and 'A' != 'a'. On top of that, it's pretty ridiculous to dock 40% of the points for that.
 

Qurupeke

Member
I am having a dispute with a professor's grading on an assignment. First I want to ask people how they would interpret the assignment.

The assignment says the program should ask for two strings from the user. The program will then state whether the second string is a substring of the first string. That's all the assignment says. From that, would you believe that the definition of substring for this problem is or is not case sensitive?

I assumed it is case sensitive because strings are nothing more than sequences of characters. A and a are not the same character. "apple" is not a substring of "Apples". The professor disagrees and believes her problem implied ignoring case. She says I should have written that the user must enter all lower case. I lost 40% of the points this problem had on the homework due to this. I'm going to her office today to dispute this with her. How do you guys feel about this?

I'm with you. I learned that 'a' isn't the same as 'A' in programming, unless stated otherwise. She should have specified that. Are there other people who had the same problem? If more of you go, it will be easier to persuade her that this is wrong.
 

Chris R

Member
I know it's naive to think it possible, but I really don't want to see ASP.NET Web Forms ever again. I don't understand why people don't use MVC. It's just so much more elegant and clean. Clean up your legacy debt!

Web Forms aren't that bad.

I tried to get my new project on MVC but the leads wouldn't go for it :(
 

Two Words

Member
I mean, if she's having you assume things, that's not a very good precedent for a programming assignment. I would have interpreted it the same as you.

If that's literally all the information there is (so no other conventions in earlier tasks or something alike) I think you are in the right. For me substring means that the characters are equal and 'A' != 'a'. On top of that it's pretty ridiculous to subtract 40% of points for that.

I'm with you. I learned that a isn't the same as A in programming, unless stated otherwise. She should have specified that. Are there any other people that had the same problem? If more people go, it will be easier to persuade her that this is wrong.
She ended up giving back most of the credit. She agreed with my points. I feel like she should have given me full credit back, but I didn't want to fight over 2 points. More importantly, she assured me that ambiguous instructions won't be handled the same way in future assignments.
 

msv

Member
She ended up giving back most of the credit. She agreed with my points. I feel like she should have given me full credit back, but I didn't want to fight over 2 points. More importantly, she assured me that ambiguous instructions won't be handled the same way in future assignments.
In most common substring and string-equality APIs, case insensitivity has to be requested explicitly, so case-sensitive comparison is the default. In the absence of other information, case sensitivity is the accepted interpretation.

I mean, which programming languages even have string comparers that are case insensitive by default?
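For example, C++'s std::string::find is case sensitive out of the box, and case insensitivity is extra work you have to opt into. A quick sketch:

Code:
#include <algorithm>
#include <cctype>
#include <iostream>
#include <string>

int main()
{
    std::string haystack = "Apples";
    std::string needle = "apple";

    // Default behavior: case sensitive, so this finds nothing.
    bool sensitive = haystack.find(needle) != std::string::npos;

    // Case insensitivity is opt-in: lowercase both copies first.
    auto lower = [](std::string s) {
        std::transform(s.begin(), s.end(), s.begin(),
                       [](unsigned char c) { return static_cast<char>(std::tolower(c)); });
        return s;
    };
    bool insensitive = lower(haystack).find(lower(needle)) != std::string::npos;

    std::cout << sensitive << " " << insensitive << "\n"; // prints: 0 1
}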
 

vypek

Member
I am having a dispute with a professor's grading on an assignment. First I want to ask people how they would interpret the assignment.

The assignment says the program should ask for two strings from the user. The program will then state whether the second string is a substring of the first string. That's all the assignment says. From that, would you believe that the definition of substring for this problem is or is not case sensitive?

I assumed it is case sensitive because strings are nothing more than sequences of characters. A and a are not the same character. "apple" is not a substring of "Apples". The professor disagrees and believes her problem implied ignoring case. She says I should have written that the user must enter all lower case. I lost 40% of the points this problem had on the homework due to this. I'm going to her office today to dispute this with her. How do you guys feel about this?

At first read I would assume the project is case insensitive, but you are definitely right. Glad you got most of your points back and that she will take care of the ambiguity from now on. Just wondering: did you students have to handle defining/finding a substring yourselves, or could you use something pre-written?
 

Two Words

Member
At first read I would assume the project is case insensitive, but you are definitely right. Glad you got most of your points back and that she will take care of the ambiguity from now on. Just wondering: did you students have to handle defining/finding a substring yourselves, or could you use something pre-written?
She was fine with us using predefined methods, but I did it by hand.
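For the curious, the by-hand version is basically the classic nested scan, something along these lines:

Code:
#include <iostream>
#include <string>

// Returns true if needle occurs in haystack (case sensitive).
bool isSubstring(const std::string& haystack, const std::string& needle)
{
    if (needle.empty()) return true;
    if (needle.size() > haystack.size()) return false;
    for (std::size_t i = 0; i + needle.size() <= haystack.size(); ++i)
    {
        std::size_t j = 0;
        while (j < needle.size() && haystack[i + j] == needle[j]) ++j;
        if (j == needle.size()) return true;
    }
    return false;
}

int main()
{
    std::cout << isSubstring("Apples", "apple") << "\n"; // 0: case sensitive
    std::cout << isSubstring("Apples", "pples") << "\n"; // 1
}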
 

Mexen

Member
I hope someone can help me. I'm looking for information on how to integrate an expert system into an Android application.
 

Jokab

Member
Hey guys. I'm taking a course in requirements engineering, and we're specifying requirements for a project we have. We have ended up with a lot of requirements for creating, updating, and deleting entities in the system, but it seems awfully redundant to repeat yourself for every type of entity we have. The thing is that many entities, but not all, should follow CRUD exactly, so we can't really have update and delete follow implicitly from create for every entity. How do we handle this properly?
 

ty_hot

Member
Anyone tried the Coursera courses related to Android programming? I think I will start one, but it says I should have some previous Java knowledge (which I don't). Should I give it a try?
 
Anyone tried the Coursera courses related to Android programming? I think I will start one, but it says I should have some previous Java knowledge (which I don't). Should I give it a try?

If you know some basic OOP concepts like implementing interfaces, Java should not be a roadblock to starting Android development. Some basic syntax knowledge would help, though.
 
Hey guys. I'm taking a course in requirements engineering, and we're specifying requirements for a project we have. We have ended up with a lot of requirements for creating, updating, and deleting entities in the system, but it seems awfully redundant to repeat yourself for every type of entity we have. The thing is that many entities, but not all, should follow CRUD exactly, so we can't really have update and delete follow implicitly from create for every entity. How do we handle this properly?

Then Don't Repeat Yourself. There is a class waiting to be factored... or a class waiting to be composed in.
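To make that concrete with a made-up sketch: write the CRUD plumbing once and instantiate it per entity, instead of restating create/update/delete for each one.

Code:
#include <iostream>
#include <map>
#include <string>

// Hypothetical sketch: the CRUD plumbing is written once as a template
// and reused per entity, rather than repeated for every entity type.
template <typename Entity>
class Repository
{
public:
    int create(const Entity& e) { items[nextId] = e; return nextId++; }
    const Entity* read(int id) const
    {
        auto it = items.find(id);
        return it == items.end() ? nullptr : &it->second;
    }
    void update(int id, const Entity& e) { items[id] = e; }
    void remove(int id) { items.erase(id); }

private:
    std::map<int, Entity> items;
    int nextId = 0;
};

struct User { std::string name; };
struct Article { std::string title; };

int main()
{
    Repository<User> users;       // full CRUD, written once
    Repository<Article> articles; // same plumbing, no repetition
    int id = users.create({"Ada"});
    std::cout << users.read(id)->name << "\n"; // Ada
}

The same move works at the requirements level: one generic "manage entity" requirement template, parameterized by entity, with per-entity exceptions noted where CRUD doesn't fully apply.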
 
Exercise:

Write a program that asks the user for the number of:

1. humans.
2. dogs.
3. ants.
4. spiders.

Have the program output the average number of legs for all:

1. creatures.
2. mammals.
3. insects.

Assume that

1. number of humans + number of dogs > 0
2. number of ants + number of spiders > 0

Note: Just in case you didn't know: ants have 6 legs, spiders have 8 legs.

Example:
How many humans are there? 2
How many dogs are there? 5
How many ants are there? 1
How many spiders are there? 1
The average number of legs for all creatures is 4.22222.
The average number of legs for all mammals is 3.42857.
The average number of legs for all insects is 7.


So here's my attempt..
Code:
#include <iostream>
using namespace std;

int main()
{
    double humans, dogs, ants, spiders;
    
    cout << "How many humans are there?";
    cin >> humans;
    
    cout << "How many dogs are there?";
    cin >> dogs;
    
    cout << "How many ants are there?";
    cin >> ants;
    
    cout << "How many spiders are there?";
    cin >> spiders;
    
    cout << "The average number of legs for all creatures is " << (humans*2+dogs*4+ants*6+spiders*8)%(humans+dogs+ants+spiders);
    cout << "The average number of legs for all mammals is " << (humans*2+dogs*4)%(humans+dogs);
    cout << "The average number of legs for all insects is " << (ants*6+spiders*8)%(ants+spiders);
    
    system("pause");
    return 0;
}

I'm getting this message: "invalid operands of types `double' and `double' to binary `operator%'"
 

arit

Member
Exercise:

Write a program that asks the user for the number of:

1. humans.
2. dogs.
3. ants.
4. spiders.

Have the program output the average number of legs for all:

1. creatures.
2. mammals.
3. insects.

Assume that

1. number of humans + number of dogs > 0
2. number of ants + number of spiders > 0

Note: Just in case you didn't know: ants have 6 legs, spiders have 8 legs.

Example:
How many humans are there? 2
How many dogs are there? 5
How many ants are there? 1
How many spiders are there? 1
The average number of legs for all creatures is 4.22222.
The average number of legs for all mammals is 3.42857.
The average number of legs for all insects is 7.


So here's my attempt..
Code:
#include <iostream>
using namespace std;

int main()
{
    double humans, dogs, ants, spiders;
    
    cout << "How many humans are there?";
    cin >> humans;
    
    cout << "How many dogs are there?";
    cin >> dogs;
    
    cout << "How many ants are there?";
    cin >> ants;
    
    cout << "How many spiders are there?";
    cin >> spiders;
    
    cout << "The average number of legs for all creatures is " << (humans*2+dogs*4+ants*6+spiders*8)%(humans+dogs+ants+spiders);
    cout << "The average number of legs for all mammals is " << (humans*2+dogs*4)%(humans+dogs);
    cout << "The average number of legs for all insects is " << (ants*6+spiders*8)%(ants+spiders);
    
    system("pause");
    return 0;
}

I'm getting this message: "invalid operands of types `double' and `double' to binary `operator%'"

% is the modulo operator (remainder), not division /.
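In other words, swap % for / in those three lines of your program (I've also tacked on endl so the outputs don't run together):

Code:
cout << "The average number of legs for all creatures is "
     << (humans*2 + dogs*4 + ants*6 + spiders*8) / (humans + dogs + ants + spiders) << endl;
cout << "The average number of legs for all mammals is "
     << (humans*2 + dogs*4) / (humans + dogs) << endl;
cout << "The average number of legs for all insects is "
     << (ants*6 + spiders*8) / (ants + spiders) << endl;

Since the variables are doubles, / does floating-point division, which is what the expected output wants.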
 

ty_hot

Member
Do you have programming knowledge in other languages? (if yes, which language(s)?)

A long time ago I learned very basic VB in high school, then at university they taught us Pascal (lol), basic stuff as well. So basically I know almost nothing, but it was always quite easy for me to understand things and put them to work.

I started the course; the first 2 lectures (weeks) basically talk about Android in general, some specific Android classes, and stuff like that, with no programming, so it is fine, good to get a sense of it. Somebody posted about a book for learning Java; I downloaded it and will try to learn by myself.

If you know some basic OOP concepts like implementing interfaces, Java should not be a roadblock to starting Android development. Some basic syntax knowledge would help, though.

I don't, but I will read about that, thanks.
Syntax is also a good first step, but it won't be difficult to pick up.
 
Hey everyone,

I have recently developed an interest in programming and would really love to get a better understanding of it. My ultimate goal is to learn how to create video games, and don't worry, I understand that that will take thousands upon thousands of hours to learn and that there are a multitude of other factors involved. All the more reason to get going right now! I am pretty much completely new to it; all I know how to do is very basic HTML and CSS. After launching Unity for the first time, I have absolutely no idea where to start, as you may expect. What would you all recommend for my situation? Should I learn Java or C#? Or both? Where are the best places for a complete beginner to start learning these languages? I'm assuming I have hundreds of hours of learning to go before I should even attempt anything in Unity, but I'm curious if there are actually some benefits to messing around with it as I start learning some code.

I'm really looking forward to getting some feedback. Please include as many links to tutorials and what not as you want! I'm unemployed for the next couple weeks so I want to get right into this!
 

Makai

Member
Hey everyone,

I have recently developed an interest in programming and would really love to get a better understanding of it. My ultimate goal is to learn how to create video games, and don't worry, I understand that that will take thousands upon thousands of hours to learn and that there are a multitude of other factors involved. All the more reason to get going right now! I am pretty much completely new to it; all I know how to do is very basic HTML and CSS. After launching Unity for the first time, I have absolutely no idea where to start, as you may expect. What would you all recommend for my situation? Should I learn Java or C#? Or both? Where are the best places for a complete beginner to start learning these languages? I'm assuming I have hundreds of hours of learning to go before I should even attempt anything in Unity, but I'm curious if there are actually some benefits to messing around with it as I start learning some code.

I'm really looking forward to getting some feedback. Please include as many links to tutorials and what not as you want! I'm unemployed for the next couple weeks so I want to get right into this!
C#, definitely. Download the free version of Visual Studio. Make some simple command line applications, first. Find a Comp Sci 101 textbook and read it. Dick around in Unity and try to apply the lessons from the book into your personal projects. You will have rapid progress in your first year.
 
C#, definitely. Download the free version of Visual Studio. Make some simple command line applications, first. Find a Comp Sci 101 textbook and read it. Dick around in Unity and try to apply the lessons from the book into your personal projects. You will have rapid progress in your first year.

Thanks! I'll start that download now.
 
A long time ago I learned very basic VB in high school, then at university they taught us Pascal (lol), basic stuff as well. So basically I know almost nothing, but it was always quite easy for me to understand things and put them to work.

I started the course; the first 2 lectures (weeks) basically talk about Android in general, some specific Android classes, and stuff like that, with no programming, so it is fine, good to get a sense of it. Somebody posted about a book for learning Java; I downloaded it and will try to learn by myself.



I don't, but I will read about that, thanks.
Syntax is also a good first step, but it won't be difficult to pick up.

Java is not a very complicated language (at least compared to something like C++), so you should be fine if you read the Java book while doing the Android class.
 

MrCuddle

Member
Hey everyone,

I have recently developed an interest in programming and would really love to get a better understanding of it. My ultimate goal is to learn how to create video games, and don't worry, I understand that that will take thousands upon thousands of hours to learn and that there are a multitude of other factors involved. All the more reason to get going right now! I am pretty much completely new to it; all I know how to do is very basic HTML and CSS. After launching Unity for the first time, I have absolutely no idea where to start, as you may expect. What would you all recommend for my situation? Should I learn Java or C#? Or both? Where are the best places for a complete beginner to start learning these languages? I'm assuming I have hundreds of hours of learning to go before I should even attempt anything in Unity, but I'm curious if there are actually some benefits to messing around with it as I start learning some code.

I'm really looking forward to getting some feedback. Please include as many links to tutorials and what not as you want! I'm unemployed for the next couple weeks so I want to get right into this!

https://channel9.msdn.com/Series/C-Sharp-Fundamentals-Development-for-Absolute-Beginners is a great place for learning the fundamentals of C#.
 
Actually pursuing some of my goals and enrolling in some Android development classes. I had a Linux project last semester and hated it. Made me discouraged as a programmer but I got over it and now I am working twice as hard.
 
Quick question about some Object Oriented programming.

So in most OO languages you can use dot notation to access the attributes of an object.

For example

Code:
Variable = Object.attribute

However this method seems to be frowned upon and instead you are supposed to use "getter" and "setter" methods like this

Code:
Variable = Object.getAttribute()

Object.setAttribute(Value)

My question is: Why?

What I mean is why is making separate methods the "correct" way to do things and simply referencing with dot notation the "wrong" way?

What kind of complications can arise from using dot notation instead of get/set methods?
 

JesseZao

Member
Quick question about some Object Oriented programming.

So in most OO languages you can use dot notation to access the attributes of an object.

For example

Code:
Variable = Object.attribute

However this method seems to be frowned upon and instead you are supposed to use "getter" and "setter" methods like this

Code:
Variable = Object.getAttribute()

Object.setAttribute(Value)

My question is: Why?

What I mean is why is making separate methods the "correct" way to do things and simply referencing with dot notation the "wrong" way?

What kind of complications can arise from using dot notation instead of get/set methods?

It's called Encapsulation.
 

Qurupeke

Member
Quick question about some Object Oriented programming.

So in most OO languages you can use dot notation to access the attributes of an object.

For example

Code:
Variable = Object.attribute

However this method seems to be frowned upon and instead you are supposed to use "getter" and "setter" methods like this

Code:
Variable = Object.getAttribute()

Object.setAttribute(Value)

My question is: Why?

What I mean is why is making separate methods the "correct" way to do things and simply referencing with dot notation the "wrong" way?

What kind of complications can arise from using dot notation instead of get/set methods?

If I recall correctly, at least in C++, the attributes are usually private. You can't get at them without the getter. This is part of the mindset of OO programming: you have an object with certain attributes that you can't change directly, an object that only performs some specific "actions". Complications can arise when attributes that aren't meant to be changed directly get changed anyway, which is why they're private. I guess this matters most when many people work on a program, but I'm new to this, so I'm not sure if that's the (only) case. Security reasons, generally. I hope I at least partially answered your question. :p
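A minimal C++ sketch of that idea (names made up):

Code:
#include <iostream>
#include <stdexcept>

class Account
{
public:
    int balance() const { return balance_; } // getter: read-only access

    void deposit(int amount) // mutation goes through a method that can validate
    {
        if (amount < 0) throw std::invalid_argument("negative deposit");
        balance_ += amount;
    }

private:
    int balance_ = 0; // private: callers can't just set this to anything
};

int main()
{
    Account a;
    a.deposit(50);
    // a.balance_ = -1000; // won't compile: balance_ is private
    std::cout << a.balance() << "\n"; // 50
}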
 

upandaway

Member
Also, some languages have ways of implementing getters and setters behind the object.attribute syntax, so even if you see code like that, it might not necessarily be breaking encapsulation.
 

Snow

Member
Quick question about some Object Oriented programming.

So in most OO languages you can use dot notation to access the attributes of an object.

For example

Code:
Variable = Object.attribute

However this method seems to be frowned upon and instead you are supposed to use "getter" and "setter" methods like this

Code:
Variable = Object.getAttribute()

Object.setAttribute(Value)

My question is: Why?

What I mean is why is making separate methods the "correct" way to do things and simply referencing with dot notation the "wrong" way?

What kind of complications can arise from using dot notation instead of get/set methods?

So the basic idea is that anything that can be accessed through object.<something> is part of an object's public interface. This is something you want to remain fairly stable, because every change to the public interface means updating all the code that uses it. By directly exposing the internal data you're tying your public interface to the internal data representation. This means that it becomes much harder to change that representation, or that your public interface is going to change more often than you want.

Having getters and setters there allows you to wedge some code in between, letting you change the internal representation while still presenting the same interface to the outside world. You can add code in the getters and setters that translates the new internal data representation to the old one the outside world is expecting.

Having said all that, I do find getters and setters kind of ugly and distasteful. First of all, having pure data objects that you throw around can be totally reasonable in many cases; e.g. I do a lot of 3D programming, and I'm not about to add getters and setters to my 3D vector class. And secondly, getters and setters are a kludgy way to do what properties do much better. E.g. in Python you may have a class with elements 'a' and 'b', accessed via object.a and object.b. If you decide that 'a' needs to be stored in a different way, you can do that: add a method called 'a' and decorate it with '@property'. The 'a' element can still be accessed using the exact same object.a syntax, allowing you to keep the interface the same and move seamlessly between the versions of 'a'.
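To put that first point in C++ terms with a made-up example: the class below switched its internal representation from Cartesian to polar coordinates, but callers of x() and y() never notice.

Code:
#include <cmath>
#include <iostream>

// An earlier version stored x and y directly; this one stores polar
// coordinates instead. The public interface is unchanged.
class Point
{
public:
    Point(double x, double y) : r_(std::hypot(x, y)), theta_(std::atan2(y, x)) {}
    double x() const { return r_ * std::cos(theta_); }
    double y() const { return r_ * std::sin(theta_); }

private:
    double r_, theta_; // representation changed; interface didn't
};

int main()
{
    Point p(3.0, 4.0);
    std::cout << p.x() << ", " << p.y() << "\n"; // still 3, 4
}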
 

Makai

Member
I've worked on a large application where the people before me made every field public and interacted with them as they pleased. It was not pretty. Productivity was basically zero because any change you made was guaranteed to break something else. Situations like this can be avoided by restricting access early on.
 
I just started a job at a serious games company on Monday.
The job sounded really good on paper, with a focus on developing native mobile and mobile web apps, and the people are really great and I really like the atmosphere, but god, programming in JavaScript is such a damn hassle! :/ When is wasm coming again? I can't wait to replace this god awful language, at least on the client side. I've heard good things about Node.js.
 

Somnid

Member
I just started a job at a serious games company on Monday.
The job sounded really good on paper, with a focus on developing native mobile and mobile web apps, and the people are really great and I really like the atmosphere, but god, programming in JavaScript is such a damn hassle! :/ When is wasm coming again? I can't wait to replace this god awful language, at least on the client side. I've heard good things about Node.js.

JavaScript can be a rough transition, especially if you come from traditional languages like Java or C++, as you'll have to throw out some of what you previously thought was best practice and use some different patterns. I imagine it'll grow on you over time, though; I certainly felt the same way when I started, but I really like it now.
 

OceanBlue

Member
Page-long methods are terrifying.

I think I actually saw the stream where he explained it. IIRC he said that, for methods in which you don't expect any part to be reused, it doesn't make sense to split long methods into subprocedures because it just adds more things to remember.
 
I think I actually saw the stream where he explained it. IIRC he said that, for methods in which you don't expect any part to be reused, it doesn't make sense to split long methods into subprocedures because it just adds more things to remember.
I know his justification for it, but I still disagree, because splitting a method into smaller subprocedures is also about comprehensibility. If the pieces don't need to be reused and would clutter up the namespace, they should be definitions that are local to the function. In fact, I would argue that not having little functions building up to larger pieces of functionality leads to a situation where one inevitably goes back to fix a definition and, in the process of trying to comprehend the function, ends up having to keep far more in one's head than would have been necessary.
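In C++, for example, those local definitions can be lambdas scoped to the function, so each step gets a name without anything leaking into the enclosing namespace. A hypothetical sketch:

Code:
#include <cctype>
#include <iostream>
#include <string>
#include <vector>

// One procedure, but the steps are named locally instead of being
// inlined or promoted to free functions nobody else needs.
void processOrders(const std::vector<std::string>& orders)
{
    auto validate = [](const std::string& order) {
        return !order.empty();
    };
    auto normalize = [](std::string order) {
        for (char& c : order)
            c = static_cast<char>(std::tolower(static_cast<unsigned char>(c)));
        return order;
    };

    for (const auto& order : orders)
    {
        if (!validate(order)) continue;
        std::cout << normalize(order) << "\n";
    }
}

int main()
{
    processOrders({"Widget", "", "GADGET"}); // prints: widget, gadget
}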
 

Makai

Member
I think I actually saw the stream where he explained it. IIRC he said that, for methods in which you don't expect any part to be reused, it doesn't make sense to split long methods into subprocedures because it just adds more things to remember.
That's basically his argument.

https://youtu.be/JjDsP5n2kSM

But he takes it a step further and thinks you should do straight procedures even if you think code will be reused.
 

OceanBlue

Member
I know his justification for it, but I still disagree, because splitting a method into smaller subprocedures is also about comprehensibility. If the pieces don't need to be reused and would clutter up the namespace, they should be definitions that are local to the function. In fact, I would argue that not having little functions building up to larger pieces of functionality leads to a situation where one inevitably goes back to fix a definition and, in the process of trying to comprehend the function, ends up having to keep far more in one's head than would have been necessary.

I lean more towards what you're saying, also because, in my (very limited) programming experience, you won't always be using the most appropriate design or understand what functionality you might need where when you start a project. I find that if I at least try to break functions apart into very precise functionality, it's easier to pick out spots that might need refactoring later on when I review the code or need the same functionality for something else.

Although I'm an awful violator of YAGNI right now :(. I dunno lol.

Edit: That said, I'm currently working on an application with relatively simple business logic. I don't have experience with compilers or anything that involves elaborate business code so I don't know how it goes in different types of software.
 
Page-long methods are terrifying.

True story: the application I currently work on had a ~3000 line method when I first joined several years ago, and several others with well over 1000 lines. I won't tell you whether or not a method that had a 1500+ line loop body still exists. I won't tell you that.

It does.

I think I actually saw the stream where he explained it. IIRC he said that, for methods in which you don't expect any part to be reused, it doesn't make sense to split long methods into subprocedures because it just adds more things to remember.

Forget that. Maintenance on long methods is a nightmare. Break those things apart and name the smaller functions well. You shouldn't need to "remember" it all. The methods should be small and the names should tell you most everything you need to know.

The problem with the application I support is 98% due to its horrid design and 2% due to the natural complexity of the business. And despite working on the application for years, much of the original problems remain (case in point, the 1500+ line loop) because new functionality seems to always get prioritized above dealing with the technical debt of bad, hard to read, harder to maintain code.
 

JeTmAn81

Member
Even if there's no reuse it's best to break down large methods into submethods to aid in readability and testing.

I just started a job at a serious games company on Monday.
The job sounded really good on paper, with a focus on developing native mobile and mobile web apps, and the people are really great and I really like the atmosphere, but god, programming in JavaScript is such a damn hassle! :/ When is wasm coming again? I can't wait to replace this god awful language, at least on the client side. I've heard good things about Node.js.

What don't you like about JS? I wouldn't count on it going away anytime soon. It's only getting more popular.
 

OceanBlue

Member
The problem with the application I support is 98% due to its horrid design and 2% due to the natural complexity of the business. And despite working on the application for years, much of the original problems remain (case in point, the 1500+ line loop) because new functionality seems to always get prioritized above dealing with the technical debt of bad, hard to read, harder to maintain code.

Is this common? I feel like this is probably the case with the company I work at too, or at least the team I am on.

As an aside, do companies usually attempt to invest in teaching developers software design, or do most developers pick it up while working on software if at all? I've been working for almost a year and I feel like I've only improved because the software I'm maintaining is painful to maintain and I like reading programming books and listening to talks, so I can contextualize my painful maintenance experiences with what I'm learning outside of work. Does learning design even matter that much? Lol, am I just confusing what's important because of my lack of experience?
 