Hey guys, looking for advice on a good book to learn Java.
I used to program in Java in college years ago, so I understand the basic gist, but it's been a long time and I think I need to get back into it. I can also program in Python and a little JavaScript.
I have purchased paperback copies of "The Pragmatic Programmer" and "Growing Object-Oriented Software, Guided by Tests" from Amazon, so I should have them by the middle of next month.
I have "Clean Code" by Robert C. Martin in PDF format, and also the Gang of Four's "Design Patterns" in PDF.
So far I haven't read any of these, so I'll spend the next year or so going through them.
Can anyone recommend a good general book for Java, or what is generally considered the "best" Java book? I suppose I'm sort of a beginner, because I haven't used it in a long time and only to a basic level.
Head First Java
Thanks! I'll grab that one then.
Head First Java
Besides Head First Java you probably want another book, as HFJ only covers Java 5. One of these will probably be fine:
This post is so jaded, haha.
My two recommendations are Windows-specific, but they are here:
https://www.amazon.com/dp/0735662789/?tag=neogaf0e-20
https://www.amazon.com/dp/0321374460/?tag=neogaf0e-20
The second is more advanced (and also better, for that reason). There are a number of books that focus on other debuggers and other platforms, but I haven't read them. I think most of the techniques (especially in the second book) are applicable on other platforms if you have a good enough knowledge of your debugger's command set.
Can anyone explain the hate for Java?
The only real issues I've found with it are GC at scale and verbosity.
Can anyone explain the hate for Java?
The only real issues I've found with it are GC at scale and verbosity.
Because it doesn't do anything better than other languages. Usually languages have some sort of hook, something they are designed to handle, whether that's expressiveness of certain concepts, approachability for novices, etc. For Java it was the JVM, which allowed you to run the same code on different machines; in 2016 plenty of languages have multi-platform runtimes, and they are faster than Java. Java is also very slow to evolve: it lacks many ergonomic features that have been added to contemporary languages, and the tools are still fairly primitive when you stack them up against things like Visual Studio. And finally, while virtually all usable languages are open source and many have nice developer communities, Java still has Oracle, which seems to want to litigate against people for using Java APIs.
in 2016 plenty of languages have multi-platform runtimes, and they are faster than Java.
https://en.wikipedia.org/wiki/Java_performance#Comparison_to_other_languages
Java is also very slow to evolve: it lacks many ergonomic features that have been added to contemporary languages
Hard to know what you mean by ergonomic features, but Java 1.8 introduced some features similar to Scala that make working in it a bit more elegant.
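For a concrete taste, here is a minimal sketch of the Java 8 additions I mean (lambdas, method references, and the Stream API; the class name is my own):

import java.util.Arrays;
import java.util.List;

public class Java8Sketch {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("Ada", "Grace", "Linus");

        // Before 1.8 a pipeline like this meant anonymous inner classes;
        // lambdas and method references cut most of that boilerplate.
        names.stream()
             .filter(n -> n.length() > 3)       // lambda expression
             .map(String::toUpperCase)          // method reference
             .forEach(System.out::println);     // prints GRACE and LINUS
    }
}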
and the tools are still fairly primitive when you stack them up against things like Visual Studio.
IntelliJ IDEA is quite an impressive and feature-rich IDE. It's also the foundation of Android Studio.
And finally, while virtually all usable languages are open source and many have nice developer communities, Java still has Oracle, which seems to want to litigate against people for using Java APIs.
First, there's http://openjdk.java.net/
While all this is true, Java has quite an amazing ecosystem around it. Some other languages have it beaten in specific niches (Python is the obvious choice for scientific computing, for example), but Java can be used for pretty much anything if you want.
I'm not here to be Java's defense force; I just find it wearisome that many people bag on a select few languages without making a valid case against them.
You could pretty much swap in C# for any situation where you use Java and be happier.
Why would I be happier? I'm open-minded and curious. I haven't used C#, so I'm not familiar with the novel features it offers.
Many thanks... I'll at least look at the points discussed there. I've never thought there would be enough material to write a book on debuggers.
What would be a good language to learn to get into machine learning?
Python.
Syntactically it's very similar, but the language itself is just nicer and filled with a lot of useful sugar (LINQ, nullable types, generators, async/await). Visual Studio is the best IDE, period, and it natively supports C#. JetBrains also makes an awesome plugin for Visual Studio called ReSharper; if you love their Java tools you'll love it too. The release cadence for new features is also faster.
It used to be a harder sell because it was siloed in the Microsoft ecosystem, but now that it's open source and runs on Linux, it should easily fill the niche that Java used to have there.
Then again, Java seems to be the most popular language in the world. According to the same site, it was the language that saw the highest increase in rating in 2015. Furthermore, it seems to be the second most employable language after SQL. Despite its shortcomings, I don't think learning C# over Java is such a clear-cut case.
It's clear cut if you replace "employable" with "enjoyable"
Then again, Java seems to be the most popular language in the world. According to the same site, it was the language that saw the highest increase in rating in 2015. Furthermore, it seems to be the second most employable language after SQL. Despite its shortcomings, I don't think learning C# over Java is such a clear-cut case.
It's clear cut if you replace "employable" with "enjoyable"
I'm not arguing enjoyment, I'm arguing employability and usage.
Java has GC at scale problems? I thought Hotspot was incremental? And someone has some kind of fancy concurrent GC for it, too.
It is incremental, but a system whose GC settings are not tuned correctly can still end up spending an inordinate amount of time performing GC. Consider an application that regularly receives large transient blobs of data. If your GC strategy cannot handle a message of that size, it will go straight into your tenured working set, which is only cleaned during a mixed GC event.
It is incremental, but a system whose GC settings are not tuned correctly can still end up spending an inordinate amount of time performing GC. Consider an application that regularly receives large transient blobs of data. If your GC strategy cannot handle a message of that size, it will go straight into your tenured working set, which is only cleaned during a mixed GC event.
Gotcha. Care and caution. In that case I would either move the blob off heap, or I would increase heap size to be more appropriate for the expected working set.
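If you do push the blob off heap, direct buffers are the usual first step; a minimal sketch (the class name and the 64 MB size are just examples):

import java.nio.ByteBuffer;

public class OffHeapBlob {
    public static void main(String[] args) {
        // allocateDirect reserves memory outside the normal Java heap, so a
        // large transient blob never enters the tenured generation at all.
        ByteBuffer blob = ByteBuffer.allocateDirect(64 * 1024 * 1024);
        blob.put(0, (byte) 42);          // read/write through the buffer as usual
        System.out.println(blob.get(0)); // prints 42
        // The native memory is released once the buffer object is collected;
        // the "increase heap size" option is just a JVM flag, e.g. -Xmx8g.
    }
}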
I can't wrap my head around this MASM assembly code...
Would anyone mind answering/explaining my bolded questions?
readVal PROC
push ebp
mov ebp, esp
pushad ;push all registers to save
startAgain:
mov edx, [ebp+12] ;@address of buffer byte array
mov ecx, [ebp+8] ;size of buffer byte array into ecx
;read the input
mGetString edx, ecx ;invoke the getString macro to read in the user's string
;set up registers for conversion
mov esi, edx ;move the address of buffer into esi
mov eax, 0
mov ecx, 0
mov ebx, 10
;load the string in byte by byte
ByteByByte:
lodsb ;loads from memory at esi
cmp ax, 0 ;check if we have reached the end of the string
je finished ;if so jump to finish
;check the range to make sure char is an int in ascii
cmp ax, 48 ;0 is at 48
jb error
cmp ax, 57 ;9 is at 57
ja error
;adjust for value of digit
sub ax, 48 ;-48 for value of digit
[B]1. xchg eax, ecx ;place character value in ecx[/B]
mul ebx ;multiply by 10 for correct digit place
jc error ;jmp if carry flag is set meaning overflow pg. 191 textbook
jnc errorFree ;jmp to errorFree if no carry flag is set pg. 191 textbook
error:
mDisplayString errorMessage
jmp startAgain ;start over bc invalid input
errorFree:
add eax, ecx ;add the digit (in correct spot) to the total of int
xchg eax, ecx ;exchange for the next loop through
[B]3.[/B] jmp ByteByByte ;examine next byte
finished:
xchg ecx, eax
[B]2. mov DWORD PTR buffer, eax ;move eax to the pointer to save the int in passed variable[/B]
popad ;restore all registers
pop ebp
ret 8
readVal ENDP
1. Why is there an exchange call there? 0 was placed in ecx, so wouldn't the following mul ebx simply result in 0, instead of multiplying the value that was in eax by 10?
2. Why is DWORD PTR buffer used here? Earlier on in this code the author had typed " mov esi, edx ;move the address of buffer into esi"
so the address of the buffer should be in esi, right? Why then use buffer? I've tried subbing this line out with DWORD PTR [esi], or [esi] but that isn't moving the value in eax into the buffer.
It's hard to rebut the argument "I don't like this". I just want people to try, or at least consider things before they adhere to a forum consensus as dogma.
For a bunch of intelligent people, developers seem to enjoy demonizing languages. Why is that? Demonizing is myopic and misleading, especially for people who haven't gotten a chance to try things out. If someone asked me if they should write a distributed system in Brainfuck, I'd tell them it's probably not the right tool for the job, since it would be difficult and doesn't have the library support needed to get it done in a reasonable amount of time. But should I dissuade them from learning the language? Of course not.
Every language I've used in my career has some quality or qualities that shine. If I want to quickly write a script, especially if parsing text, I'll use Perl. Embedded and/or high performance needs? C/C++. Easily deployable and performant application that can rely on decades of libraries? Scala or Java.
I'm not saying that these are the only languages that can fit into these niches. They're just the ones that I am most familiar with. If you told me you're more comfortable scripting with Ruby or Python, who am I to tell you that is wrong?
I implore you to reconsider responding to the question "How can I solve X with language Y?" with "Don't".
Yes. But note the line I have marked 3 above. It unconditionally jumps back above. So the next iteration through this loop, ECX will not be 0.
Pay attention to the semantics of the LODSB instruction.
http://www.intel.com/content/dam/ww...r-instruction-set-reference-manual-325383.pdf
Look on page 591.
The no-operands form provides “short forms” of the byte, word, and doubleword versions of the LODS instructions. Here also DS:(E)SI is assumed to be the source operand and the AL, AX, or EAX register is assumed to be the destination operand. The size of the source and destination operands is selected with the mnemonic: LODSB (byte loaded into register AL), LODSW (word loaded into AX), or LODSD (doubleword loaded into EAX).
After the byte, word, or doubleword is transferred from the memory location into the AL, AX, or EAX register, the (E)SI register is incremented or decremented automatically according to the setting of the DF flag in the EFLAGS register. (If the DF flag is 0, the (E)SI register is incremented; if the DF flag is 1, the (E)SI register is decremented.) The (E)SI register is incremented or decremented by 1 for byte operations, by 2 for word operations, or by 4 for doubleword operations.
So after LODSB executes, ESI will not contain the address of the buffer anymore. If you imagine a loop like this:
while (*buffer != 0) {
// do something
buffer++;
}
You can think of ESI as containing the value of "buffer" in this example. It increases each time through the loop.
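Stripped of the register shuffling, readVal is just the standard parse-an-integer loop. A rough Java equivalent of what it computes (a sketch, not the assignment's required form):

public class ParseDecimal {
    static int readVal(String s) {
        int total = 0;                           // lives in ECX between iterations
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            if (c < '0' || c > '9') {            // the jb/ja range check
                throw new NumberFormatException("invalid input: " + s);
            }
            total = total * 10 + (c - '0');      // mul ebx, then add eax, ecx
        }
        return total;                            // what mov DWORD PTR buffer, eax stores
    }

    public static void main(String[] args) {
        System.out.println(readVal("12345"));    // prints 12345
    }
}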
I highly recommend checking out this post if you find the topic of GC interesting: http://product.hubspot.com/blog/g1gc-fundamentals-lessons-from-taming-garbage-collection
Saved.
Could I also ask for some course advice? I'm trying to trim my coursework, and I will need to drop a few courses.
Here are the ones I'm considering: a course in optimization, distributed systems, parallelization and concurrency, and advanced OS architecture. Currently I'm leaning towards the first three and dropping the fourth.
I don't really know how useful any of these courses are. I'm choosing parallelization because multi-core CPUs are important, but I don't even know if I will ever need to write software that uses multiple cores (or if I'm even smart enough to do that). I keep seeing distributed systems popping up in job postings, so I'll take it even though I don't really know what it is (something about networks of computers doing a task). I'm dropping advanced OS architecture because I highly doubt I'll need to know that much detail about Windows/Linux/whatever.
Edit: Forgot to mention our school has a stream for machine learning. I was initially interested; however, speaking with employers, they are essentially hiring PhDs in machine learning and MScs in Data Science for anything related to data mining, so spending precious undergrad courses on ML seems like a waste.
Parallelization will be a very complementary class to distributed systems. Drop the first or the last. If you take the last course, it will set you up very nicely for a career in operations. Potentially.
Well, you said learning C# over Java isn't such a clear case, yet C# is directly behind it on your list, meaning it's still extremely prevalent. Anecdotally, you will have no problem getting a job using either language if you're slightly competent, so why settle for Java if you don't like it?
The metric seems to be most available positions, not popularity. But note that the quality of a language isn't really related to employability. I wouldn't set out to be making 150K using Rust just yet, and there are certainly lots of legacy apps that need maintenance and will fetch top dollar because nobody else can deal with them. So when I say switching to C# might make people happier, I'm not saying "all Java devs should switch careers", but rather that as a business decision, if I was granted the opportunity to pick a technology, I'd use C# over Java most of the time. And I say it as an interesting language to check out if you spend lots of time in Java. Any worthwhile dev is a polyglot.
Part of it is just how well do you understand the operating system? If you have a weird crash in the middle of nowhere, how do you trace that back to a heap corruption? Nowadays the answer might be "run it under ASAN", but what if it's in a core dump?
I see... That may be interesting, although I admit I never really have a lot of interaction with the OS, at least not the kind that makes the code crash. That really depends on what you're coding, I guess.
It's hard to rebut the argument "I don't like this". I just want people to try, or at least consider things before they adhere to a forum consensus as dogma.
I agree... I support the idea of learning at least a language each year (I often learn two, in fact: a real one and a toy one for fun). Even if you don't use them afterwards, it broadens your views.
For a bunch of intelligent people, developers seem to enjoy demonizing languages. Why is that?
I'd say at least half of it is a joke, a way to vent frustration. I think that more often than not, if you're against a language, you've practiced it, suffered because of it, and are trying to recover.
If someone asked me if they should write a distributed system in Brainfuck, I'd tell them it's probably not the right tool for the job, since it would be difficult and doesn't have the library support needed to get it done in a reasonable amount of time. But should I dissuade them from learning the language? Of course not.
Well, I agree...
Easily deployable and performant application that can rely on decades of libraries? Scala or Java.
If Scala can be a substitute for Java, what kind of use would I have for Java?
That depends a bit on who you ask, and I'd say it doesn't mean much anyway.
According to the same site, it was the language that saw the highest increase in rating in 2015.
They also produced this graph:
I think you're confusing the two rankings. The first one, measuring popularity, as far as I understand takes the number of hits across 25 different search engines.
Hits could very well be people asking for help more often because the language is awful.
Have another interview tomorrow for an entry-level programming job. The guy recommended I go over OO concepts and some web principles. I am trying to think of what would be appropriate whiteboard questions. Finding duplicates in an array and counting them, comparing two arrays to find common values, and reversing a String seem very common. Not really sure what else to go over. I will probably also review traversing a binary tree.
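For instance, the duplicate-counting and string-reversal warm-ups usually come out looking something like this (a rough Java sketch; the class and method names are my own):

import java.util.HashMap;
import java.util.Map;

public class WhiteboardWarmups {
    // Count occurrences of each value; entries with a count > 1 are the duplicates.
    static Map<Integer, Integer> countDuplicates(int[] xs) {
        Map<Integer, Integer> counts = new HashMap<>();
        for (int x : xs) {
            counts.merge(x, 1, Integer::sum);   // increment, starting from 1
        }
        counts.values().removeIf(c -> c == 1);  // drop the non-duplicates
        return counts;
    }

    // Reverse a string with a two-pointer swap over a char array.
    static String reverse(String s) {
        char[] cs = s.toCharArray();
        for (int i = 0, j = cs.length - 1; i < j; i++, j--) {
            char tmp = cs[i];
            cs[i] = cs[j];
            cs[j] = tmp;
        }
        return new String(cs);
    }

    public static void main(String[] args) {
        System.out.println(countDuplicates(new int[] {1, 2, 2, 3, 3, 3})); // {2=2, 3=3}
        System.out.println(reverse("whiteboard"));                          // draobetihw
    }
}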
Some basic OO principles: encapsulation, polymorphism, inheritance, abstraction. Encapsulation means objects are self-contained and hide their internal state. Polymorphism means code written against one interface can work with objects of many different classes. Overriding is resolved at run time: the method has the same name and parameters but different code in the subclass. Overloading is when you have the same name but different parameters or arguments. Inheritance is when a child class inherits the properties of the parent while having the means to contain unique properties as well. Interfaces are method signatures with no implementation. Abstract classes are similar, but at least one method is not implemented while others can be. Umm... I prefer interfaces over abstract classes because a class can only extend one abstract class, while it can implement multiple interfaces.
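If they ask me to put overriding vs. overloading on the board, something like this should cover it (hypothetical classes, just for illustration):

// Overriding vs. overloading in one small example.
abstract class Shape {
    abstract double area();                 // no body: every subclass must override this

    void describe() {                       // has a body: subclasses may override it
        System.out.println("a shape with area " + area());
    }
}

class Circle extends Shape {
    private final double r;
    Circle(double r) { this.r = r; }

    @Override
    double area() { return Math.PI * r * r; }  // overriding: same signature, resolved at run time
}

class Printer {
    void print(int x)    { System.out.println("int: " + x); }    // overloading: same name,
    void print(String s) { System.out.println("text: " + s); }   // different parameters, resolved at compile time
}

public class OopBoardDemo {
    public static void main(String[] args) {
        Shape s = new Circle(1.0);  // polymorphism: a Shape reference holding a Circle
        s.describe();               // dispatches to Circle.area() at run time
        new Printer().print(42);
        new Printer().print("forty-two");
    }
}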
I am just rambling. The web stuff is a little more troubling. He recommended jQuery and JavaScript, and said it would be 'cool to know Angular'. I am reviewing Angular now, just getting a fast introduction to it. SPA? Single-page application. jQuery is a JS library. Need to cram, lol.
They might also ask you about the DOM, JSON, relational databases (SQL), REST, and the MVC design pattern, in case you want to look into those. I think you'll do fine! Good luck man!
Do you mind if I ask what the main language at the job will be, and what main language you know now? Does the job specialize in web development?
I will look into all of the things you mentioned. Thanks!
I use Java. The job is a .NET position, but they have a big team, so they work on a bunch of stuff and it's not limited to just .NET frameworks, if that makes sense. I know the transition won't be so bad if I need to use C#. Java was just the language I learned in school.
writeVal PROC
pushad
mov eax, [esp+40] ;number to convert to string
mov edi, [esp+36] ;tempString address (assumption: points at the LAST byte of the buffer)
mov ebx, 10 ;will need to divide by 10
mov byte ptr [edi], 0 ;null terminator goes at the very end of the buffer
dec edi ;digits are written backwards, starting just before the terminator
std ;DF = 1, so stosb walks edi downwards
numToString:
mov edx, 0 ;clear edx so div divides edx:eax correctly
div ebx ;divide by 10: quotient in eax, remainder (the digit) in edx
add edx, 48 ;add 48 to convert from int to ascii
push eax ;save the quotient
mov eax, edx ;al now holds the ascii digit
stosb ;store it and decrement edi
pop eax ;restore the quotient
cmp eax, 0
jne numToString ;loop until the quotient is 0
lea edx, [edi+1] ;the string starts at edi+1, not at [esp+36]: edi ended one byte below the most significant digit
cld ;clear DF before calling library code
call WriteString
popad
ret 8
writeVal ENDP
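For comparison, here is the same divide-by-ten, fill-from-the-right idea that writeVal implements, sketched in Java (not the assignment code; the names are my own):

public class WriteValSketch {
    // Peel off the lowest digit each pass and store it one cell to the left,
    // mirroring the backwards stosb loop above.
    static String writeVal(int n) {
        char[] buf = new char[10];            // enough for any non-negative int
        int i = buf.length;                   // one past the last digit cell
        do {
            buf[--i] = (char) ('0' + n % 10); // remainder of the div-by-10, +48 in ASCII
            n /= 10;                          // the quotient feeds the next pass
        } while (n != 0);
        return new String(buf, i, buf.length - i);
    }

    public static void main(String[] args) {
        System.out.println(writeVal(1234));   // prints 1234
    }
}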