
Programming |OT| C is better than C++! No, C++ is better than C

Hey guys, looking for advice on a good book to learn Java.

I used to program in Java in college years ago, so I understand the basic gist, but it's been a long time and I think I need to get back into it. I can program in Python and a little JavaScript too.

I have purchased paperback copies of "The Pragmatic Programmer" and "Growing Object-Oriented Software, Guided by Tests" from Amazon, so I should have them by the middle of next month.

I have "Clean Code" by Robert C. Martin in PDF format, and also the Gang of Four's "Design Patterns" in PDF.

So far I haven't read any of these yet so I'll spend the next year or so going through them.

Can anyone recommend a good general book for Java, or I suppose what is generally considered the "best" Java book? I'm probably sort of a beginner, because I haven't used it in a long time and only ever used it at a basic level.
Besides Head First Java you'll probably want another book, as HFJ only covers Java 5. One of these will probably be fine:

Building Java Programs - a back to basic approach

Either Core Java (both Vol. I and Vol. II) or Core Java for the Impatient. The latter is basically a condensed version of Core Java Vols. I & II.




After you've passed the beginner stage, take a look at some of these:

Effective Java (2nd edition)

Java Concurrency in Practice

Well Grounded Java Developer

Java 8 in Action: Lambdas, Streams, and functional-style programming


This post is so jaded, haha.

I would call it realistic myself. ;)
 

Koren

Member
My two recommendations are Windows-specific, but here they are:

https://www.amazon.com/dp/0735662789/?tag=neogaf0e-20

https://www.amazon.com/dp/0321374460/?tag=neogaf0e-20

The second is more advanced (and also better, for that reason). There are a number of books that focus on other debuggers and other platforms, but I haven't read them. I think most of the techniques (especially in the second book) are applicable on other platforms if you have a good enough knowledge of your debugger's command set.
Many thanks... I'll at least look at the points discussed there. I never thought there would be enough material to write a book on debuggers.
 

Somnid

Member
Can anyone explain the hate for Java?

The only real issues I've found with it are GC at scale and verbosity.

Because it doesn't do anything better than other languages. Usually languages have some sort of hook, something they are designed to handle, whether that's expressiveness for certain concepts, approachability for novices, etc. For Java it was the JVM, which allowed you to run the same code on different machines; in 2016 plenty of languages have multi-platform runtimes, and they are faster than Java. But Java is also very slow to evolve, it lacks many ergonomic features that have been added to contemporary languages, and the tools used are still fairly primitive when you stack them up against things like Visual Studio. And finally, while virtually all usable languages are open source and many have nice developer communities, Java still has Oracle (which bought Sun), which seems to want to litigate against people for using Java APIs.
 
Can anyone explain the hate for Java?

The only real issues I've found with it are GC at scale and verbosity.

That's enough downsides for most people. Boilerplate code is another common complaint.
I tend to agree that these days you should learn Java only if you have to (for a job or for school); there are a lot of good alternatives for whatever you might want to do.
You don't even need it for Android programming most of the time if you use Xamarin.
 
Because it doesn't do anything better than other languages. Usually languages have some sort of hook, something they are designed to handle, whether that's expressiveness for certain concepts, approachability for novices, etc. For Java it was the JVM, which allowed you to run the same code on different machines; in 2016 plenty of languages have multi-platform runtimes, and they are faster than Java. But Java is also very slow to evolve, it lacks many ergonomic features that have been added to contemporary languages, and the tools used are still fairly primitive when you stack them up against things like Visual Studio. And finally, while virtually all usable languages are open source and many have nice developer communities, Java still has Oracle (which bought Sun), which seems to want to litigate against people for using Java APIs.

While all this is true, Java has quite an amazing ecosystem around it. While some other languages have it beaten in specific niches (Python is the obvious choice for scientific computing for example), Java can be used for pretty much anything if you want to.
 

poweld

Member
For Java it was the JVM, which allowed you to run the same code on different machines; in 2016 plenty of languages have multi-platform runtimes, and they are faster than Java. But Java is also very slow to evolve, it lacks many ergonomic features that have been added to contemporary languages, and the tools used are still fairly primitive when you stack them up against things like Visual Studio. And finally, while virtually all usable languages are open source and many have nice developer communities, Java still has Oracle (which bought Sun), which seems to want to litigate against people for using Java APIs.
https://en.wikipedia.org/wiki/Java_performance#Comparison_to_other_languages
The JVM is pretty quick; I'm not sure which VM you're referring to that beats it by any significant margin.

But Java is also very slow to evolve, it lacks many ergonomic features that have been added to contemporary languages
Hard to know what you mean by ergonomic features, but Java 1.8 introduced some Scala-like features that make working in it a bit more elegant.

and the tools used are still fairly primitive when you stack them up against things like Visual Studio.
IntelliJ IDEA is quite an impressive and feature-rich IDE. It's also the core that Android Studio is built on.

And finally, while virtually all usable languages are open source and many have nice developer communities, Java still has Oracle (which bought Sun), which seems to want to litigate against people for using Java APIs.
First, there's http://openjdk.java.net/
This isn't really a point against Java. Also, the development community for it is great: there is a library for everything under the sun, as well as lots of documentation and answered questions.

I think GC is tricky if you're running high-performance applications, but it's also manageable with some tweaking. Java is also hardly the only language that uses GC, and in exchange these languages let developers make fewer mistakes and develop more quickly. It seems to me like a fairly reasonable tradeoff, though a JVM without stop-the-world GC would certainly be welcome :)

As for the verbosity, it rarely becomes much of an issue when using an IDE. Also, by comparison, C++ is just about as verbose as Java unless you use typedefs as shortcuts; you could attain a similar effect by encapsulating the painfully verbose parts of your code, too.

I'm not here to be Java's defense force, I just find it wearisome that many people bag on a select few languages without making a valid case against them.
 

Somnid

Member
While all this is true, Java has quite an amazing ecosystem around it. While some other languages have it beaten in specific niches (Python is the obvious choice for scientific computing for example), Java can be used for pretty much anything if you want to.

Plenty of languages have good ecosystems, to the point where this again isn't better than anything else. Python does, C# does, the JS/Node ecosystem is like an order of magnitude larger than anyone else's, and even fledgling languages like Swift and Rust have quickly built fairly robust ecosystems and will almost certainly outpace Java within a few years.

I'm not here to be Java's defense force, I just find it wearisome that many people bag on a select few languages without making a valid case against them.

You could pretty much swap in C# in any situation where you use Java and be happier. Again, it's not to say Java is PHP-level terrible, but there's simply better.
 

Somnid

Member
Why would I be happier? I'm open-minded and curious. I haven't used C#, so I'm not familiar with the novel features it offers.

Syntactically it's very similar, but the language itself is just nicer and filled with a lot of useful sugar (LINQ, nullable types, generators, async/await). Visual Studio is the best IDE, period, and natively supports C#. JetBrains also makes an awesome plugin for Visual Studio called ReSharper; if you love their Java stuff, you'll love it too. The release cadence for new features is also faster.

It used to be a harder sell because it was siloed in Microsoft's ecosystem, but now that it's open source and runs on Linux, it should easily fill the niche that Java used to have there.
 
Many thanks... I'll at least look at the points discussed there. I never thought there would be enough material to write a book on debuggers.

Part of it is just: how well do you understand the operating system? If you have a weird crash in the middle of nowhere, how do you trace that back to a heap corruption? Nowadays the answer might be "run it under ASan", but what if it's in a core dump? There's definitely an art to debugging.
 

poweld

Member
Syntactically it's very similar, but the language itself is just nicer and filled with a lot of useful sugar (LINQ, nullable types, generators, async/await). Visual Studio is the best IDE, period, and natively supports C#. JetBrains also makes an awesome plugin for Visual Studio called ReSharper; if you love their Java stuff, you'll love it too. The release cadence for new features is also faster.

It used to be a harder sell because it was siloed in Microsoft's ecosystem, but now that it's open source and runs on Linux, it should easily fill the niche that Java used to have there.

LINQ: this seems like a JDBC equivalent; correct me if I'm mistaken.

Nullable types: Java 1.8 introduced the Optional type: https://docs.oracle.com/javase/8/docs/api/java/util/Optional.html

Generators: Java 1.8 introduced the Supplier interface: https://docs.oracle.com/javase/8/docs/api/java/util/function/Supplier.html

Async/await: Java 1.8 introduced the CompletableFuture type: https://docs.oracle.com/javase/8/docs/api/java/util/concurrent/CompletableFuture.html
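For illustration, here's a minimal sketch showing those three Java 8 types side by side (the class name and values are mine, not from the docs):

```java
import java.util.Optional;
import java.util.concurrent.CompletableFuture;
import java.util.function.Supplier;

public class Java8Features {
    public static void main(String[] args) throws Exception {
        // Optional: an explicit "may be absent" value instead of a bare null
        Optional<String> name = Optional.of("gaf");
        System.out.println(name.map(String::toUpperCase).orElse("anonymous")); // GAF

        // Supplier: lazily produces a value only when get() is called
        Supplier<Double> lazy = Math::random;
        System.out.println(lazy.get() >= 0.0); // true

        // CompletableFuture: compose async stages without blocking in between
        CompletableFuture<Integer> f = CompletableFuture
                .supplyAsync(() -> 21)
                .thenApply(x -> x * 2);
        System.out.println(f.get()); // 42; blocks only here, at the end
    }
}
```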

I'm sure that C# is nice to work in, but I'm still not entirely convinced why a person should switch to it. More importantly, suggesting that new developers eschew such a ubiquitous and competent language is misleading.
 
LINQ is nothing like JDBC. It lets you write SQL-style queries over arbitrary data structures. If you have an array of Foos, you can get a generator for all those where Foo.x == 7. You can use this to do joins across containers, just like SQL.
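The closest Java 8 analogue to that filter would be the Streams API; a rough sketch, with a hypothetical Foo matching the example above:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class LinqAnalog {
    // Hypothetical Foo from the example above
    static class Foo {
        final int x;
        Foo(int x) { this.x = x; }
    }

    public static void main(String[] args) {
        List<Foo> foos = Arrays.asList(new Foo(7), new Foo(3), new Foo(7));
        // Roughly "from f in foos where f.x == 7 select f" in LINQ terms;
        // the stream is lazy, like a generator, until collect() runs it
        List<Foo> sevens = foos.stream()
                .filter(f -> f.x == 7)
                .collect(Collectors.toList());
        System.out.println(sevens.size()); // prints 2
    }
}
```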

C# also has a vastly superior generics model, and it recently gained a really, really good concurrency model with async/await that runs circles around everything else out there.

Reflection in C# is way better than in Java too (at least as of the last time I used Java).

The tools are also miles ahead of Java's.
 

Gurrry

Member
Taking my first ever computer science class today. It's gonna be my minor. I'm a little worried. I mainly code games with C# and JavaScript (self-taught), but this class is apparently about data types/Python. Not only that, it's a 3-hour class... bleh.

I really hope I can find computer science classes that will help me with my main goal of making games, or at least help me understand coding better and how to do things more efficiently.

I also really hope I didn't make a horrible mistake making this my minor. I think I can do it, but I really hope I don't get in there and realize too late that I bit off more than I can chew.
 
Java has GC-at-scale problems? I thought HotSpot was incremental? And someone has some kind of fancy concurrent GC for it, too.

Also, cppking is right on the money about C#. I didn't appreciate attributes+reflection until I saw what wizards can do with magic.
 
I can't wrap my head around this MASM assembly code...

Would anyone mind answering/explaining my bolded questions?

Code:
readVal PROC
	push	ebp
	mov		ebp, esp
	pushad	;push all registers to save

startAgain:
	mov		edx, [ebp+12]	;@address of buffer byte array
	mov		ecx, [ebp+8]	;size of buffer byte array into ecx

;read the input
	mGetString	edx, ecx	;invoke the getString macro to read in the users string

;set up registers for conversion
	mov		esi, edx		;move the address of buffer into esi
	mov		eax, 0
	mov		ecx, 0
	mov		ebx, 10

;load the string in byte by byte
ByteByByte:
	lodsb					;loads from memory at esi
	cmp		ax, 0			;check if we have reached the end of the string
	je		finished		;if so jump to finish

;check the range to make sure char is a int in ascii
	cmp		ax, 48	;0 is at 48
	jb		error
	cmp		ax, 57	;9 is at 57
	ja		error

;adjust for value of digit
	sub		ax, 48			;-48 for value of digit
1.	xchg	eax, ecx		;place character value in ecx
	mul		ebx				;multiply by 10 for correct digit place
	jc		error			;jmp if carry flag is set meaning overflow pg. 191 textbook
	jnc		errorFree		;jmp to errorFree if no carry flag is set pg. 191 textbook

error:
	mDisplayString	errorMessage
	jmp startAgain			;start over bc invalid input

errorFree:
	add		eax, ecx		;add the digit (in correct spot) to the total of int
	xchg	eax, ecx		;exchange for the next loop through
	jmp		ByteByByte		;examine next byte
	
finished:
	xchg	ecx, eax
2.	mov		DWORD PTR buffer, eax	;move eax to the pointer to save the int in passed variable

	popad		;restore all registers
	pop ebp
	ret 8
readVal ENDP

1. Why is there an exchange call there? 0 was placed in ecx. So wouldn't the following mul ebx simply result in 0? Instead of multiplying the value that was in eax by 10?

2. Why is DWORD PTR buffer used here? Earlier on in this code the author had typed " mov esi, edx ;move the address of buffer into esi"

so the address of the buffer should be in esi, right? Why then use buffer? I've tried subbing this line out with DWORD PTR [esi], or [esi] but that isn't moving the value in eax into the buffer.
 
But I mean, it's not like C# is unemployable, or even hard to find a job in. It's #4 on that same list. Even with C++, which is #6 on that list, I could find a job within a week if I needed to.

I also expect C# to gain in popularity over the next couple years.
 

Kalnos

Banned
I'm not arguing enjoyment, I'm arguing employability and usage.

Well, you said learning Java over C# isn't such a clear case, yet C# is directly behind it on your list, meaning it's still extremely prevalent. Anecdotally, you will have no problem getting a job using either language if you're even slightly competent, so why settle for Java if you don't like it?
 

Somnid

Member
Then again, Java seems to be the most popular language in the world. According to the same site, it was the language that saw the highest increase in rating in 2015. Furthermore, it seems to be the second most employable language after SQL. Despite its shortcomings, I don't think learning C# over Java is such a clear-cut case.

The metric seems to be most available positions, not popularity. But note that the quality of a language isn't really related to employability. I wouldn't set out to be making 150K using Rust just yet, and there are certainly lots of legacy apps that will fetch top dollar because nobody else can deal with them but they need maintenance. So when I say switching to C# might make people happier, I'm not saying "all Java devs should switch careers", but rather that, as a business decision, if I were granted the opportunity to pick a technology, I'd use C# over Java most of the time. And I say it as an interesting language to check out if you spend lots of time in Java. Any worthwhile dev is a polyglot.
 

poweld

Member
It's clear cut if you replace "employable" with "enjoyable"

It's hard to rebut the argument "I don't like this". I just want people to try, or at least consider things before they adhere to a forum consensus as dogma.

For a bunch of intelligent people, developers seem to enjoy demonizing languages. Why is that? Demonizing is myopic and misleading, especially for people that haven't gotten a chance to try things out. If someone asked me if they should write a distributed system in Brainfuck, I'd tell them it's probably not the right tool for the job, since it would be difficult and doesn't have the library support needed to get it done in a reasonable amount of time. But should I dissuade them from learning the language? Of course not.

Every language I've used in my career has some quality or qualities that shine. If I want to quickly write a script, especially one that parses text, I'll use Perl. Embedded and/or high-performance needs? C/C++. Easily deployable, performant application that can rely on decades of libraries? Scala or Java.

I'm not saying that these are the only languages that can fit into these niches. They're just the ones that I am most familiar with. If you told me you're more comfortable scripting with Ruby or Python, who am I to tell you that is wrong?

I implore you to reconsider responding to the question, "How can I solve X with language Y" with "Don't".

Java has GC at scale problems? I thought Hotspot was incremental? And someone has some kind of fancy concurrent GC for it, too.
It is incremental, but a system whose GC settings are not tuned correctly can still end up spending an inordinate amount of time performing GC. Consider an application that regularly receives large transient blobs of data. If your GC strategy cannot handle a message of that size, it will go straight into your tenured working set which is only cleaned during a mixed GC event.
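(For what it's worth, under G1, the collector the "mixed GC" terminology comes from, an allocation of at least half a region is treated as "humongous" and bypasses the young generation, so region size is one knob to check. The sizes and jar name below are purely illustrative.)

```shell
# Illustrative only: raising the G1 region size raises the threshold
# (half a region) above which an allocation is treated as humongous.
java -XX:+UseG1GC -XX:G1HeapRegionSize=16m -Xmx4g -jar app.jar
```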

So when I say switch to C# might make people happier I'm not saying "all Java devs should switch careers" but rather as a business decision if I was granted the opportunity to pick a technology, I'd use C# over Java most of the time.

I can respect this standpoint.
 
Thanks for all the advice on books and Java in general.

I'll look into all of those books over the next year or two to get up to speed.

As for why I'm learning Java: I'm already familiar with the very basics of the language, so it won't take too long to get up to speed. In my current job I work in support and we use Java as our main language; I want to transition into the development team eventually, and I've made this known and spoken to them about it, so it should be on the cards at some stage in the future.

I'm also going back to college in September and Java is taught there, so it makes sense to get into it from that perspective too.

As for C#, it doesn't seem a million miles away from Java, so if I end up leaving this job or having to work with C#, I imagine I could get to grips with it fairly quickly, considering the familiar syntax/concepts it uses.
 
It is incremental, but a system whose GC settings are not tuned correctly can still end up spending an inordinate amount of time performing GC. Consider an application that regularly receives large transient blobs of data. If your GC strategy cannot handle a message of that size, it will go straight into your tenured working set which is only cleaned during a mixed GC event.
Gotcha. Care and caution. In that case, I would either move the blob off-heap, or increase the heap size to be more appropriate for the expected working set.
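"Off-heap" here could mean, for instance, a direct ByteBuffer, whose payload the GC never has to copy or scan (the size here is illustrative):

```java
import java.nio.ByteBuffer;

public class OffHeapBlob {
    public static void main(String[] args) {
        // A direct buffer's 64 MB payload lives outside the Java heap;
        // the GC only tracks the small ByteBuffer object itself, not
        // the 64 MB behind it.
        ByteBuffer blob = ByteBuffer.allocateDirect(64 * 1024 * 1024);
        blob.putInt(0, 42);
        System.out.println(blob.getInt(0)); // prints 42
    }
}
```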
 
I can't wrap my head around this MASM assembly code...

Would anyone mind answering/explaining my bolded questions?

Code:
readVal PROC
	push	ebp
	mov		ebp, esp
	pushad	;push all registers to save

startAgain:
	mov		edx, [ebp+12]	;@address of buffer byte array
	mov		ecx, [ebp+8]	;size of buffer byte array into ecx

;read the input
	mGetString	edx, ecx	;invoke the getString macro to read in the users string

;set up registers for conversion
	mov		esi, edx		;move the address of buffer into esi
	mov		eax, 0
	mov		ecx, 0
	mov		ebx, 10

;load the string in byte by byte
ByteByByte:
	lodsb					;loads from memory at esi
	cmp		ax, 0			;check if we have reached the end of the string
	je		finished		;if so jump to finish

;check the range to make sure char is a int in ascii
	cmp		ax, 48	;0 is at 48
	jb		error
	cmp		ax, 57	;9 is at 57
	ja		error

;adjust for value of digit
	sub		ax, 48			;-48 for value of digit
1.	xchg	eax, ecx		;place character value in ecx
	mul		ebx				;multiply by 10 for correct digit place
	jc		error			;jmp if carry flag is set meaning overflow pg. 191 textbook
	jnc		errorFree		;jmp to errorFree if no carry flag is set pg. 191 textbook

error:
	mDisplayString	errorMessage
	jmp startAgain			;start over bc invalid input

errorFree:
	add		eax, ecx		;add the digit (in correct spot) to the total of int
	xchg	eax, ecx		;exchange for the next loop through
3.	jmp		ByteByByte		;examine next byte
	
finished:
	xchg	ecx, eax
2.	mov		DWORD PTR buffer, eax	;move eax to the pointer to save the int in passed variable

	popad		;restore all registers
	pop ebp
	ret 8
readVal ENDP

1. Why is there an exchange call there? 0 was placed in ecx. So wouldn't the following mul ebx simply result in 0? Instead of multiplying the value that was in eax by 10?
Yes. But note the line I have marked 3 above. It unconditionally jumps back above. So the next iteration through this loop, ECX will not be 0.

2. Why is DWORD PTR buffer used here? Earlier on in this code the author had typed " mov esi, edx ;move the address of buffer into esi"

so the address of the buffer should be in esi, right? Why then use buffer? I've tried subbing this line out with DWORD PTR [esi], or [esi] but that isn't moving the value in eax into the buffer.

Pay attention to the semantics of the LODSB instruction.

http://www.intel.com/content/dam/ww...r-instruction-set-reference-manual-325383.pdf

Look on page 591.

The no-operands form provides "short forms" of the byte, word, and doubleword versions of the LODS instructions. Here also DS:(E)SI is assumed to be the source operand and the AL, AX, or EAX register is assumed to be the destination operand. The size of the source and destination operands is selected with the mnemonic: LODSB (byte loaded into register AL), LODSW (word loaded into AX), or LODSD (doubleword loaded into EAX).

After the byte, word, or doubleword is transferred from the memory location into the AL, AX, or EAX register, the (E)SI register is incremented or decremented automatically according to the setting of the DF flag in the EFLAGS register. (If the DF flag is 0, the (E)SI register is incremented; if the DF flag is 1, the ESI register is decremented.) The (E)SI register is incremented or decremented by 1 for byte operations, by 2 for word operations, or by 4 for doubleword operations.

So after LODSB executes, ESI will not contain the address of the buffer anymore. If you imagine a loop like this:

Code:
while (*buffer != 0) {
   // do something
   buffer++;
}

You can think of ESI as containing the value of "buffer" in this example. It increases each time through the loop.
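Putting the two answers together: at a higher level, the loop is just the standard "total = total * 10 + digit" accumulation, with ecx and eax swapping roles each iteration. A rough Java sketch of the same logic (the method name is mine):

```java
public class ParseDigits {
    // High-level equivalent of the assembly loop: ecx carries the
    // running total across iterations, eax holds the digit just read.
    static int readVal(String s) {
        int total = 0;                      // like ecx, starts at 0
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            if (c < '0' || c > '9')         // the jb/ja range check
                throw new NumberFormatException("not a digit: " + c);
            int digit = c - '0';            // the "sub ax, 48"
            total = total * 10 + digit;     // the "mul ebx" then "add eax, ecx"
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(readVal("123")); // prints 123
    }
}
```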
 
It's hard to rebut the argument "I don't like this". I just want people to try, or at least consider things before they adhere to a forum consensus as dogma.

For a bunch of intelligent people, developers seem to enjoy demonizing languages. Why is that? Demonizing is myopic and misleading, especially for people that haven't gotten a chance to try things out. If someone asked me if they should write a distributed system in Brainfuck, I'd tell them it's probably not the right tool for the job, since it would be difficult and doesn't have the library support needed to get it done in a reasonable amount of time. But should I dissuade them from learning the language? Of course not.

Every language I've used in my career has some quality or qualities that shine. If I want to quickly write a script, especially one that parses text, I'll use Perl. Embedded and/or high-performance needs? C/C++. Easily deployable, performant application that can rely on decades of libraries? Scala or Java.

I'm not saying that these are the only languages that can fit into these niches. They're just the ones that I am most familiar with. If you told me you're more comfortable scripting with Ruby or Python, who am I to tell you that is wrong?

I implore you to reconsider responding to the question, "How can I solve X with language Y" with "Don't".

Certainly learn both languages. It's just that everyone I know who has learned both (including myself) overwhelmingly prefers C#.
 
Yes. But note the line I have marked 3 above. It unconditionally jumps back above. So the next iteration through this loop, ECX will not be 0.



Pay attention to the semantics of the LODSB instruction.

http://www.intel.com/content/dam/ww...r-instruction-set-reference-manual-325383.pdf

Look on page 591.



So after LODSB executes, ESI will not contain the address of the buffer anymore. If you imagine a loop like this:

Code:
while (*buffer != 0) {
   // do something
   buffer++;
}

You can think of ESI as containing the value of "buffer" in this example. It increases each time through the loop.

Ah thanks a ton.
 

Kieli

Member
Could I also ask for some course advice? I'm trying to trim my coursework, and I will need to drop a few courses.

Here's some that I'm considering: a course in optimization, distributed systems, parallelization and concurrency, advanced OS architecture. Currently, I'm leaning towards the first three, and dropping the fourth.

I don't really know how useful any of these courses are. I'm choosing parallelization because multi-cores are important, but I don't even know if I will ever need to write software using multiple cores (or if I'm even smart enough to do that). I keep seeing distributed systems popping up in job postings, so I'll take it even though I don't really know what it is (something about networks of computers doing a task). I'm dropping advanced OS architecture because I highly doubt I'll need to know that much detail about Windows/Linux/whatever.

Edit: Forgot to mention our school has a stream for machine learning. I was initially interested; however, speaking with employers, they are essentially hiring PhDs in machine learning and MScs in data science for anything related to data mining. So spending precious undergrad courses on ML would likely be a waste of time for me.
 
Could I also ask for some course advice? I'm trying to trim my coursework, and I will need to drop a few courses.

Here's some that I'm considering: a course in optimization, distributed systems, parallelization and concurrency, advanced OS architecture. Currently, I'm leaning towards the first three, and dropping the fourth.

I don't really know how useful any of these courses are. I'm choosing parallelization because multi-cores are important, but I don't even know if I will ever need to write software using multiple cores (or if I'm even smart enough to do that). I keep seeing distributed systems popping up in job postings, so I'll take it even though I don't really know what it is (something about networks of computers doing a task). I'm dropping advanced OS architecture because I highly doubt I'll need to know that much detail about Windows/Linux/whatever.
Parallelization will be a very complementary class to distributed systems, so drop the first or the last. If you take the last course, it will set you up very nicely for a career in operations, potentially.
 
Could I also ask for some course advice? I'm trying to trim my coursework, and I will need to drop a few courses.

Here's some that I'm considering: a course in optimization, distributed systems, parallelization and concurrency, advanced OS architecture. Currently, I'm leaning towards the first three, and dropping the fourth.

I don't really know how useful any of these courses are. I'm choosing parallelization because multi-cores are important, but I don't even know if I will ever need to write software using multiple cores (or if I'm even smart enough to do that). I keep seeing distributed systems popping up in job postings, so I'll take it even though I don't really know what it is (something about networks of computers doing a task). I'm dropping advanced OS architecture because I highly doubt I'll need to know that much detail about Windows/Linux/whatever.

Edit: Forgot to mention our school has a stream for machine learning. I was initially interested; however, speaking with employers, they are essentially hiring PhDs in machine learning and MScs in data science for anything related to data mining. So spending precious undergrad courses on ML would likely be a waste of time for me.

Can you paste the course summaries for optimization and parallel programming?

If the parallel programming course is going to focus on OpenMP or something, then it's a waste of time.
 

Jokab

Member
Well, you said learning Java over C# isn't such a clear case, yet C# is directly behind it on your list, meaning it's still extremely prevalent. Anecdotally, you will have no problem getting a job using either language if you're even slightly competent, so why settle for Java if you don't like it?

I'm not saying learn Java just to get employed if you don't like it. But it seemed like some people were arguing that you shouldn't learn Java because it's not fun and/or C# is better. I pointed out that Java is more popular across the programming landscape and is more employable, albeit not by a large margin.

The metric seems to be most available positions, not popularity. But note that the quality of a language isn't really related to employability. I wouldn't set out to be making 150K using Rust just yet, and there are certainly lots of legacy apps that will fetch top dollar because nobody else can deal with them but they need maintenance. So when I say switching to C# might make people happier, I'm not saying "all Java devs should switch careers", but rather that, as a business decision, if I were granted the opportunity to pick a technology, I'd use C# over Java most of the time. And I say it as an interesting language to check out if you spend lots of time in Java. Any worthwhile dev is a polyglot.

I think you're confusing the two rankings. The first one, measuring popularity, as far as I understand counts hits across 25 different search engines. The second one, measuring employability, counts hits on a job search site.
 

Koren

Member
Part of it is just how well do you understand the operating system? If you have a weird crash in the middle of nowhere, how do you trace that back to a heap corruption? Nowadays the answer might be "run it under ASAN", but what if it's in a core dump?
I see... That may be interesting, although I admit I never really have a lot of interaction with the OS, at least not the kind that makes code crash. It really depends what you're coding, I guess.

It's hard to rebut the argument "I don't like this". I just want people to try, or at least consider things before they adhere to a forum consensus as dogma.
I agree... I support the idea of learning at least one language each year (I often learn two, in fact: a real one and a toy one for fun). Even if you don't use them afterwards, it broadens your views.

But once you've worked with it (and I've been involved in Java teaching, trying to show enthusiasm for it ^_^ ), there will always be languages you like and others you dislike. For good or bad reasons...

Then it all depends on what they let you do. I'm not fond of Python (although I don't dislike it as much as I did 10 years ago), but it's often really nice for quick'n'dirty code, so I use it quite often.

It's just that I really don't like Java's syntax and paradigms, and I can't think of a single situation where there isn't an alternative I prefer (Android development may be an exception to some extent, but I think native Android is Java++ levels of unbearableness). Should Java disappear into language limbo, I personally wouldn't shed a tear... and (my) world would be a better place.

For a bunch of intelligent people, developers seem to enjoy demonizing languages. Why is that?
I'd say at least half of this is a joke, a way to vent frustration. I think that more often than not, if you're against a language, you've practiced it, suffered because of it, and are trying to recover.

If someone asked me if they should write a distributed system in Brainfuck, I'd tell them it's probably not the right tool for the job, since it would be difficult and doesn't have the library support needed to get it done in a reasonable amount of time. But should I dissuade them from learning the language? Of course not.
Well, I agree...

Learn Java (and Brainfuck too).

And use Java as often as Brainfuck afterwards ;)

Easily deployable and performant application that can rely on decades of libraries? Scala or Java.
If Scala can be a substitute for Java, what use would I have for Java? ;)


That depends a bit on who you ask, and I'd say it doesn't mean much anyway.

According to the same site, it was the language that saw the highest increase in rating in 2015.
They also produced this graph:
https://upload.wikimedia.org/wikipedia/commons/f/f8/Tiobeindex.png

A short-term increase may not be the best indicator, though...
 

Koren

Member
I think you're confusing the two rankings. The first one, measuring popularity, as far as I understand, counts hits across 25 different search engines.
Hits could very well be people asking for help more often because the language is awful ;)
 
Have another interview tomorrow for an entry-level programming job. The guy recommended I go over OO concepts and some web principles. I am trying to think about what would be appropriate whiteboard questions. Seems like finding duplicates and counting them in an array, comparing two arrays and finding common values, and reversing a String are very common. Not really sure what else to go over. I will probably go over traversing a Binary Tree.

Some basic OO principles: encapsulation, polymorphism, inheritance, abstraction. Encapsulation means objects are self-contained: their internal state is hidden behind a public interface. Polymorphism means a single interface or parent type can refer to objects of multiple classes. Overriding is resolved at run time: the subclass method has the same name and parameters as the parent's, but different code. Overloading is when methods share a name but differ in their parameters. Inheritance is when a child class inherits the properties of the parent while still being able to add unique properties of its own. Interfaces are method signatures with no implementation. Abstract classes are similar, but at least one method is unimplemented while others can have bodies. Umm... interfaces are preferred (to me) over abstract classes because a class can only extend one class but can implement multiple interfaces.

I am just rambling. The web stuff is a little troubling. He recommended jQuery and JavaScript and said it would be 'cool to know Angular'. I am reviewing Angular now, just getting a fast introduction to it. SPA? Single page application. jQuery is a JS library. Need to cram, lol.
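To make the overriding/overloading and interface/abstract-class distinctions above concrete, here is a minimal Java sketch (all class names are hypothetical, just for illustration):

```java
interface Shape {                      // interface: method signatures, no implementation
    double area();
}

abstract class Named {                 // abstract class: partially implemented
    abstract String name();            // must be overridden by subclasses
    String describe() { return "a " + name(); }  // shared implementation
}

class Circle extends Named implements Shape {
    private final double r;
    Circle(double r) { this.r = r; }
    // Overriding: same name and parameters, dispatched at run time
    @Override public double area() { return Math.PI * r * r; }
    @Override String name() { return "circle"; }
}

class Square extends Named implements Shape {
    private final double side;
    Square(double side) { this.side = side; }
    @Override public double area() { return side * side; }
    @Override String name() { return "square"; }
    // Overloading: same name, different parameter list, resolved at compile time
    double area(double scale) { return side * side * scale; }
}

public class OoDemo {
    public static void main(String[] args) {
        // Polymorphism: one interface type referring to objects of multiple classes
        Shape[] shapes = { new Circle(1.0), new Square(2.0) };
        for (Shape s : shapes) System.out.println(s.area());
    }
}
```

Note how `Circle` and `Square` each extend exactly one class (`Named`) but could implement any number of interfaces, which is the "extend once vs. implement many" point.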
 
So I purchased Head First Java from Amazon (.co.uk because Ireland is good enough to set up your HQ and get tax breaks but not good enough to actually have a dedicated website...blargh)

Lots of sellers wouldn't ship to Ireland but finally got one in the end. So I think I'm all set for a little while to dive into Java, TDD in general and also improve my overall programming.

Going to keep learning Python too and web development in general. Javascript is on hold for now as I can properly dive into it once I'm more solid overall.
 
Have another interview tomorrow for an entry-level programming job. The guy recommended I go over OO concepts and some web principles. I am trying to think about what would be appropriate whiteboard questions. Seems like finding duplicates and counting them in an array, comparing two arrays and finding common values, and reversing a String are very common. Not really sure what else to go over. I will probably go over traversing a Binary Tree.

Some basic OO principles: encapsulation, polymorphism, inheritance, abstraction. Encapsulation means objects are self-contained: their internal state is hidden behind a public interface. Polymorphism means a single interface or parent type can refer to objects of multiple classes. Overriding is resolved at run time: the subclass method has the same name and parameters as the parent's, but different code. Overloading is when methods share a name but differ in their parameters. Inheritance is when a child class inherits the properties of the parent while still being able to add unique properties of its own. Interfaces are method signatures with no implementation. Abstract classes are similar, but at least one method is unimplemented while others can have bodies. Umm... interfaces are preferred (to me) over abstract classes because a class can only extend one class but can implement multiple interfaces.

I am just rambling. The web stuff is a little troubling. He recommended jQuery and JavaScript and said it would be 'cool to know Angular'. I am reviewing Angular now, just getting a fast introduction to it. SPA? Single page application. jQuery is a JS library. Need to cram, lol.

They might also ask you about the DOM, JSON, relational databases (SQL), REST, and MVC design patterns, in case you want to look into those. I think you'll do fine! Good luck man! :)

Do you mind if I ask what is the main language the job will be using and what is the main language you know now? Does the job specialize in web development?
 
They might also ask you about the DOM, JSON, relational databases (SQL), REST, and MVC design patterns, in case you want to look into those. I think you'll do fine! Good luck man! :)

Do you mind if I ask what is the main language the job will be using and what is the main language you know now? Does the job specialize in web development?

I will look into all of those, the things you mentioned. Thanks!

I use Java. The job is a .NET position, but they have a big team, so they work on a bunch of stuff, so it's not limited to just .NET frameworks, if that makes sense. I know the transition won't be so bad, if I need to use C#. Java was just the language I learned in school.
 

JeTmAn81

Member
I will look into all of those, the things you mentioned. Thanks!

I use Java. The job is a .NET position, but they have a big team, so they work on a bunch of stuff, so it's not limited to just .NET frameworks, if that makes sense. I know the transition won't be so bad, if I need to use C#. Java was just the language I learned in school.

Yeah, C# and Java are really similar. Sometimes I can look at code and not even be able to tell which one it is. Definitely brush up on those web concepts you're not familiar with, even if only cursorily. Hopefully they will mostly test you on fundamentals that carry over to any programming job rather than the specifics of a language you might not have tons of experience in.
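As a small illustration of how close the two languages are, this toy Java class (the `Greeter` name is made up for the example) reads almost identically in C#, apart from the print call and casing conventions noted in the comments:

```java
public class Greeter {
    private final String name;     // C#: private readonly string name;

    public Greeter(String name) {
        this.name = name;
    }

    public String greet() {        // C#: public string Greet()
        return "Hello, " + name + "!";
    }

    public static void main(String[] args) {
        // C#: Console.WriteLine(new Greeter("world").Greet());
        System.out.println(new Greeter("world").greet());
    }
}
```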
 
I'm trying to write a program that converts an integer into a string. But I'm having trouble because neither my textbook nor the professor really covered STOSB, and we're required to use it.

Code:
writeVal PROC
pushad

mov eax, [esp+40] ;number to convert to string
mov edi, [esp+36] ;tempString address
mov ebx, 10 ;will need to divide by 10

std

numToString:
	mov edx, 0
	div ebx ;divide by 10
	add edx, 48 ;add 48 to convert from int to ascii
	push eax
	mov eax, edx
	stosb
	pop eax
	cmp eax, 0
	jne numToString


mov edx, [esp+36]
call WriteString

popad
ret 8
writeVal ENDP

Numbers are entered 10 at a time into an array. They are then passed individually (looping through the array) to this procedure.

Any help would be appreciated. I'm getting really odd results.

Like entering: 15, 23, 222, 445, 232, 12, 112, 3, 21, 2

gives: 15
23
,222
,445
,232
212
,112
13
121
22

So I'm getting random commas, and just plain wrong numbers at certain points.
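For reference, the divide-by-10 digit loop that procedure implements can be sketched in Java (names hypothetical; this is the algorithm, not a fix for the assembly). One likely issue with the assembly version: `std` makes STOSB decrement EDI after each store, but EDI starts at the *front* of tempString, so the digits are written to addresses before the buffer; digits also come out least-significant first, so they need to be written from the end of the buffer or reversed:

```java
public class IntToString {
    // Convert a non-negative int to its decimal string, one digit at a time,
    // mirroring the DIV-by-10 loop: remainder -> ASCII, quotient -> next pass.
    static String writeVal(int n) {
        StringBuilder sb = new StringBuilder();
        do {
            int digit = n % 10;              // EDX after DIV
            sb.append((char) (digit + 48));  // add 48: digit -> ASCII
            n /= 10;                         // EAX after DIV
        } while (n != 0);
        // Digits were produced least-significant first, so reverse them
        return sb.reverse().toString();
    }

    public static void main(String[] args) {
        System.out.println(writeVal(222));
    }
}
```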
 