/dpt/ - Daily Programming Thread

Morphism edition

>previous thread
No such thing.

What are you working on, Cred Forums?

Other urls found in this thread:

blog.burntsushi.net/ripgrep/
stackoverflow.com/questions/7687082/c-memory-allocator-and-strict-aliasing
z2legacy.wordpress.com/2015/12/09/crashcouse-001-hello-world/
hackage.haskell.org/package/containers-0.5.8.1/docs/Data-Map-Lazy.html
github.com/MasterZean/z2c
en.wikipedia.org/wiki/String_interning
github.com/googlesamples/android-ndk/tree/android-mk/hello-gl2

Programming sucks.

Your mom's an excellent programmer.

kpop is programming related

first for pair programming

>Rust programmers straight up lie in benchmarks to make it seem faster than it actually is

programming is the best

Source for that claim? I'm interested.

stay in denial
blog.burntsushi.net/ripgrep/

>be a rust programmer
>call your program regret

cute :3

"5" + 2 // returns 52 because 2 is converted to "2"
"5" - 2 // returns 3 because "5" is converted to 5

I'm guessing this is JS.

what language?

real JS programmers know the struggle

>convert to first argument
You say that, but it is C logic.

Can be done in any language which allows you to name your own functions + or -

Yeah. I'm learning JS.

You say that, but if you'd spent more time looking at the post I responded to, you'd see that the behavior isn't consistent.

C doesn't implicitly cast integers to strings, or strings to integers. It treats strings as pointers, and allows implicit casting between floats and ints, but it doesn't have anywhere near JS' level of retardation.

another funny one
10000000000000000 == 10000000000000001
evaluates to true because integers above 2^53 aren't exactly representable as doubles, so both literals round to the same value.
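You can reproduce the same rounding in plain C, since it's a property of doubles rather than anything JS-specific:

#include <stdio.h>

int main(void)
{
    /* doubles have a 53-bit mantissa, so integers above 2^53
       round to the nearest representable value */
    double a = 10000000000000000.0; /* 1e16 */
    double b = 10000000000000001.0; /* rounds to the same double */
    printf("%d\n", a == b); /* prints 1 */
    return 0;
}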

no shit, he clearly wasn't making fun of the concept of operator overloading, but rather a language's default behavior

>Rust programmers use unsafe in benchmarks then say people should use Rust since it's safe
kek

my scheme evaluator in scheme passed 10 out of the 12 example test cases we were given and i think i'm going to have to say it's good enough. i have 5 more days but i don't see myself making those last 2 cases work. it doesn't help that debugging in scheme is ass cancer

part of my rationale is that my class is full of normies and i think it's likely mine has more functionality than anyone else's will

C sometimes allows implicit casts, but it's far more consistent. Besides, as the other guy says, it wouldn't convert a string to an integer. It'd offset the pointer to the string by 2, walking past the terminator, if it didn't just cause a compiler error.
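For illustration, this is all + means for a string in C; it's pointer arithmetic, and nothing about the contents is converted:

#include <stdio.h>

int main(void)
{
    const char *s = "hello";
    /* s + 2 just advances the pointer by two chars */
    printf("%s\n", s + 2); /* prints "llo" */
    return 0;
}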

What are the recommended resources from which to learn C#? I am familiar with Java and C++.

endo

outo

exo*

At least Javascript has consistently retarded type coercion. For C it is either implementation-dependent or undefined.

>type coercion
>i said x, but do y instead without telling me
why does this exist?

I think C's redeeming factor is that it's consistent in its conversions. What JS will do is anyone's guess.

I mean, it makes sense if you understand operator overloading; but it can still catch you off guard.

it makes sense if you understand the reason behind the retardation, but it still doesn't make it not retarded

Is Python a meme language?

Stupid frogposter

No, it's a constructed one.

It's not even a meme, it's just sad. Sad and terrible.

how do I get my code that colorful?

It's like the training wheels without the bike.

CCCC type system

or haskell

crayons

it looks like atom. its part of the botnet

No it doesn't

what if I want a colorful c++?

In what way?

no u

>when you discover a fatal bug that utterly destroys your psyche and launches you into a state of deep depression for 3 minutes until you think up a glorious solution

Is there a better feel?

Not possible

but crayons aren't free as in freedom.
unless I steal them from the market.

It's too easy and there's no compiler.

Why is LabVIEW event handling such shit?

>main program loop with a glorified switch structure that cannot capitalize on the parallelism NI so boasts about
>use queues and notifiers to offset some of that shit elsewhere
>no callback functions gotta have a second eternally running while loop to listen in
>nigga whose code I'm overhauling put a second event structure entirely against NI's recs.

>It's too easy

Thank you.

my university has a machine learning course, and a graduate level AI course. but the AI course has really low prerequisites and doesn't require machine learning (and i also saw it's a requirement for a "game development" cert so that makes me think it's not that hard). think it'd be ok to take them in the same semester?

It's a stupid language for stupid people.

Sure, why not?

i'm concerned machine learning may just be an easier version of the AI class. aren't they the same thing?

>operator overloading;
harmful as fuck tbqh
I have no idea why it's even allowed

No...

Just like variable naming

How is it any different than function overloading?

IT FOCKING WORKS

__declspec(noinline) static int EntryPoint() {
    unsigned long address;
    unsigned long base_address = 0;
    LDR_MODULE *module;

    // Get current address and pointer to Process Environment Block (PEB)
    __asm {
        call L1
        L1: pop address
        mov eax, dword ptr fs:[0x30]
        mov module, eax
    }

    module = (LDR_MODULE *)((PEB *)module)->LoaderData->InLoadOrderModuleList.Flink;
    while (module->BaseAddress) {
        unsigned long base = (unsigned long)module->BaseAddress;
        unsigned long difference = address - base;

        // The current address belongs to this module if it falls within its image
        if (difference < module->SizeOfImage) {
            base_address = base;
            break;
        }
        module = (LDR_MODULE *)module->InLoadOrderModuleList.Flink;
    }

    // Jump to the original entry point
    __asm {
        mov edx, base_address
        mov eax, OEP_SIG // Original entry point will be stored here
        or edx, eax
        jmp edx
    }
}


This function is written to an .exe and the entry point is modified to point to it. It works on ASLR-enabled executables and then jumps back to the original entry point

how are operators different from functions?

$(OBJECTS): $(INPUT)
	@mkdir -p $(OBJ)
	$(CC) $(CFLAGS) $(DEBUG) -I$(INC) -c $< -o $@

How do I get make to read past the first file in INPUT when using automatic variables?
It's compiling all my object files from the same source file.
gcc -O2 -ansi -g -Wall -Wextra -Iinclude -c src/one.c -o obj/one.o
gcc -O2 -ansi -g -Wall -Wextra -Iinclude -c src/one.c -o obj/two.o
gcc -O2 -ansi -g -Wall -Wextra -Iinclude -c src/one.c -o obj/three.o

I already checked INPUT, they're all unique files.

k i figured out a not absolutely ugly way to do what i wanted to do last thread
loli = list([dict({'loli':'the best'}),dict({'lolicon':'saikou'})])
print(('{}'*len(loli)).format(*loli))

they are really broad terms. i could see them not overlapping that much
talk to someone at your school. if you don't know any faculty then start knowing. that's much more important than any class

Where's that fucking bird? I want my assignment now

the gf 3rd from left

>mfw got the basic interface from my school assignment working with exceptions and switches and all that shit
I'm the shipping guy from yesterday if anyone happens to remember. I haven't felt so proud in a while.
Now I have the nastiest part left to do which involves setting up the simulation itself and making the formulas to modify the user cargo and give recommendations based on it.

>exceptions
As expected from a spongefag

forgot to mention i wanted to keep the dict properties inside of them too, so i could access their items

It makes sense in Ruby, where everything is an object, so there isn't really such a thing as primitive operations... in most other languages, I guess it's just a convenience.

in Haskell the difference between a two+ parameter function and an infix operator is just the symbols you use
and even then you can convert between the two

a `myFunc` b   -- function used infix
(<+>) a b      -- operator used prefix

Spongebobs are better than animefags and frogposters.

so, any non null pointer will be a truthy value?

What's wrong with handling exceptions?

please hang yourself
preferably tonight

>Spongebobs are better than animefags

Cfags think that exceptions are fundamentally flawed.

>dislike cute girls
>tells anyone who likes them to go kill themselves

I finally see the light in regards to using restrict and strict aliasing. The rules seem mostly straightforward.

The only question I have is why do people have problems with strict aliasing in general? Why would you dereference a pointer you have cast into another type and call that perfectly valid code? The only exception I can find is with void pointers, possibly.
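For reference, the classic violation next to its well-defined counterpart (bits_bad and bits_ok are just illustrative names):

#include <stdint.h>
#include <string.h>

uint32_t bits_bad(float f)
{
    /* undefined behavior: reading a float object through
       a uint32_t lvalue violates strict aliasing */
    return *(uint32_t *)&f;
}

uint32_t bits_ok(float f)
{
    /* well-defined: copy the representation instead; compilers
       optimise this down to the same single move */
    uint32_t u;
    memcpy(&u, &f, sizeof u);
    return u;
}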

>cfag
Haskellfag here

Cfags and Haskellfags are both awful. They deserve each other.

just sit in on the class i do that all the time. especially for my electronics professor because he also happens to be the chair and worked for defense contractors as an engineer. i've sat in on most classes he teaches but only received credit for one. no professor legitimately cares if you sit in and if they do they have a serious mental issue.

I've downloaded atom and it is really pretty. but where is the compile button?

Is OBJECTS a list of .o names and INPUT a list of .c names?
If so your rule doesn't really make much sense, you're telling make to match something like:
one.o two.o three.o: one.c two.c three.c
and then $< picks out the first one, one.c and that's it.

You're going about things completely the wrong way. Using explicit rules by trying to use variables as the targets and dependencies is not gonna work.
If you want make to automatically build a bunch of .o files from .c files you should either use pattern rules or the implicit rules (read about them in some makefile tutorial)

>tfw both cfag and haskellfag
I'm not smug BTW.

Are you the chosen one?

Don't bully user just because he's a homosexual.

Hey, C isn't awful. It's a great teaching language given its ability to expose low level functionality in a simplistic manner. You do not have to explain much in terms of language constructs, and can focus on the basics of computer architecture as you teach the student the basics of software development.

Alt tab to your terminal
Type "make"

Switch to TypeScript; it throws compile-time errors for things like "5" - 2

Or get in the habit of explicitly converting your variables:

var x = "5";
var y = +x - 2;
var z = (""+x) + "2";
// etc.


As short-cuts for conversion:

+x // number
""+x // string
!!x // boolean
!!+x // safer boolean (converts literal string "0" to false)


While on the subject of shortcuts:

x|0 // truncate toward zero (not a true floor for negatives)

Operator overloading is sweet, if you don't go full retard with it.

t. writing a matrix library

Please respond. I am legit curious as to what situations in C would have you dereference a pointer you have cast from another type, other than void pointers. And is dereferencing pointers obtained from void pointers common at all?

how do I into restrict?

const restrict struct ayy * lmao;
or
const struct ayy * restrict lmao;

what do each mean

((char *)&anInt)[0]
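char and unsigned char pointers are the blessed exception: they may alias any object. Expanded into a runnable sketch (the printed value assumes a little-endian machine):

#include <stdio.h>

int main(void)
{
    int anInt = 0x11223344;
    /* inspecting an object's bytes through an unsigned char
       pointer is explicitly allowed by the aliasing rules */
    unsigned char low = ((unsigned char *)&anInt)[0];
    printf("%02x\n", low); /* 44 on a little-endian machine */
    return 0;
}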

Memory pools. I haven't implemented one in C, and I'm more familiar with C++, but you can't use operator new combined with templated types in C, as far as I know.

>Alt tab to your terminal
>Type "make"
I alt tabbed, but it went to chrome. How to proceed?

Try alt + f4, the alt tab thing is a common bug on linux distributions.

I didn't say that C was bad, but Cfags are undeniably bad. If you could turn smugness into electricity, they'd provide a nearly infinite source.

Why doesn't Linux simply just work?

I'm pretty sure they'll both work, but every example I'm seeing uses the second. If the first works, it is equivalent to the second. Unlike const, restrict shouldn't have two possible interpretations (pointer to const vs const pointer)

Char pointers are legal though, so that's not much of a concern.

Only the second call is valid. The first one isn't allowed since it doesn't apply to the pointer and restrict only works on pointers.

Does it have to be of type void *? I was under the impression that memory pools are mostly typed.

But yeah, implementing a generic memory pool would be hard without breaking strict aliasing. If C can improve its generic programming facilities, I suspect there wouldn't be as much of an uproar over strict aliasing at all.

well in C++, you can just use templated types, as an example:

template <typename T>
void foo(T bar) {
    // do stuff
}


Then when compiling, it instantiates the template function with the proper type T, while maintaining type safety, producing a different function foo for each type it's called with.

nope, thats why more people use windows.

Actually, I looked more into the rules. As long as you only work with void pointers inside the memory pool and don't give the memory a type until you hand it to the user, it seems to be legal.

stackoverflow.com/questions/7687082/c-memory-allocator-and-strict-aliasing
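Something like this minimal bump-allocator sketch, say (struct pool and pool_alloc are hypothetical names, and the backing buffer is assumed to come from malloc so it has no declared type):

#include <stddef.h>

struct pool {
    unsigned char *base; /* malloc'd backing buffer */
    size_t used, cap;
};

void *pool_alloc(struct pool *p, size_t n)
{
    n = (n + 15) & ~(size_t)15; /* keep 16-byte alignment */
    if (p->used + n > p->cap)
        return NULL;
    void *ptr = p->base + p->used;
    p->used += n;
    /* the caller's first store gives this block its effective type */
    return ptr;
}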

how about in type declarations
would both
foo(struct bar restrict *) &
foo(restrict struct bar *)
work?

*PUSHES UP GLASSES*

hmm? What is this? This trifling nonsense? From an amateur, no doubt.

You call us smug. What you should be calling us is: creators. Because let me fill you on a little secret, friend. One which you would no doubt do good to learn about: C programmers run the world.

We are the programmers, we are the real coders, WE are the ones you call when your high level BS slows a system to a crawl. We are the ones who produce TRUE cross-platform code with ultimate speed. If we're arrogant, it is because we have earned that arrogance through extensive knowledge of the hardware we OWN, and of the passion we bleed.

Don't believe me? Take a look at linux, take a look at sockets, take a look at practically any useful embedded system in the past 30 years. And what do you see? You see C. And you see C because, unlike your meme language, C is eternal. We are not going away, we are not dying out. We will write great code until we breathe our last breaths. Because that's what a C programmer does.

I don't have time for this. The only thing I regret is that life doesn't have automatic garbage collection for unthinking pieces of shit like you.

*snorts and dismisses you*

*dereferences behind you*
Heh, nice try ... kid
*free's you*

The restrict qualifier has to apply to a pointer, so neither of those works: in both declarations it ends up qualifying the struct rather than the pointer. You'd write foo(struct bar * restrict) instead.

Basically, restrict goes after the *, and it promises that the thing the pointer points at will never get accessed by anything else, only through that restrict pointer itself and whatever pointer arithmetic you do on it.
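A minimal example of what that promise buys the compiler:

#include <stddef.h>

/* restrict tells the compiler dst and src never overlap, so it can
   keep values in registers and vectorise the loop freely */
void add_arrays(float *restrict dst, const float *restrict src, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] += src[i];
}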

New language, "Z2", much better than Rust, and Go, object-oriented, supports low-level code, runs fast, has an IDE, not an SJW

Here is sample code

Here is the official website

z2legacy.wordpress.com/2015/12/09/crashcouse-001-hello-world/

Is this copypasta? If so, I can't find it. It's amazing, though.

>object-oriented

How the fuck do you guys get ideas for things to program? That is without a doubt the hardest part about programming.

Why wouldn't you want OO? It's how you get polymorphism and code reuse.

>class
>def
>@main
>ACTUALLY USING C++ IOSTREAM SYNTAX WITH >> AND <<

>life doesn't have automatic garbage collection for unthinking pieces of shit like you
You gotta free them yourself.

so basically, put the restrict next to the dereferencing operator?

This is some next level bait. Well done.

>main inside class
Disregarded.

Yes.

>That is without a doubt the hardest part about programming.
I wish.
Everything BUT the ideas is the hardest part about programming.

Programming is for chumps

Pair programming is fucking shit.
t. someone who has done pair programming

Loner

What exactly does 'with' do here?

it passes session and url, waits at least 10 seconds for the statements inside to be done, passes the response, then uses the text attribute and finally returns it

Most of my project ideas are just things that I think would be cool to toy around with (e.g. let's make a bootloader, or a simple virtual machine!). Many of them get abandoned relatively quickly, but if I learned anything playing around with them, I keep their source code tucked away in a directory somewhere. For the most part, I have little to show future employers other than my degree and a couple of completed projects that no one actually uses (I think someone might be using my extension to mruby-socket though, since it has apparently gotten more commits from someone else with regards to its Win32 functionality), but I am hoping my coming master's degree and a couple of connections will make up for the lack of substance.

Why don't you build a pair of things that will look appealing in a portfolio?

>def
>@
>streams
>forced OOP
100% garbage.

i usually just try to fix things that bug me. there must be things that bug you
i wanted to start blogging again and didn't like what was out there. that's one project
i read a lot of papers and didn't like any of the reference management tools i found, so there's another

i'm not saying this is lucrative but i find it really hard to stay focused on this stuff if i'm not addressing some personal annoyance

You use unsafe on 5% of the code (tight loops) and normal Rust on the other 95%. The attack surface is reduced to that well-studied 5%, and the common errors get checked on the 95%.

This is good economics for security; getting the same ratio in, say, C is costly and error prone.

Haskell programmer here, learning the POSIX/Linux C API. I was doing a server/client combo that passes file descriptors through unix sockets. The thing is that the API usage isn't clear just from the headers; you have to read the man page examples and struct-chase the headers to *know* how these procedures are meant to be used. Contrast this with Haskell, where the types usually tell you how to fit the parts of your problem together.

Any tips for this? I am interested in C for the memory-flipping aspects (and OS design), not for performance and the like
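For reference, the fd-passing core ended up looking roughly like this (a minimal sketch with error handling stripped; send_fd and the one-byte payload are just illustrative choices):

#include <string.h>
#include <sys/socket.h>
#include <sys/uio.h>

/* send one open fd over a connected AF_UNIX socket */
int send_fd(int sock, int fd)
{
    char byte = 0;
    struct iovec iov = { .iov_base = &byte, .iov_len = 1 };
    union {
        char buf[CMSG_SPACE(sizeof(int))];
        struct cmsghdr align; /* forces correct alignment */
    } u;
    struct msghdr msg = { 0 };

    msg.msg_iov = &iov;
    msg.msg_iovlen = 1;
    msg.msg_control = u.buf;
    msg.msg_controllen = sizeof(u.buf);

    struct cmsghdr *cmsg = CMSG_FIRSTHDR(&msg);
    cmsg->cmsg_level = SOL_SOCKET;
    cmsg->cmsg_type = SCM_RIGHTS; /* "this message carries fds" */
    cmsg->cmsg_len = CMSG_LEN(sizeof(int));
    memcpy(CMSG_DATA(cmsg), &fd, sizeof(int));

    return sendmsg(sock, &msg, 0) < 0 ? -1 : 0;
}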

>Is there a better feel?

Yes. I don't suffer from autism.

THICC

If you're interested in the memory aspect, you can always use Haskell's Foreign module.
But simple things like let bindings and closures in Haskell have memory implications

How is automatetheboringstuff dot com as an intro to python as a programming noob?

Why would you learn Python?

to automate boring stuff?

You can do that in any programming language.
Why not start off with a good programming language?

why are you implying python is bad

>implying
It IS bad

What's the best way to learn ruby then?

WHY is it BAD

>ruby
Did you not read my post?

Don't even know what would look appealing/have little motivation to care.

General lack of typing, dictionary based, OO features, __things_like_this, general messy interactions between features, overreliance on FFI, bad userbase, I could go on

that says nothing about why it's bad

I can understand why someone with difficulty understanding simple sentences might like Python

It's bad because it's easy and the normies like it reeeeee Haskell4Lyfe

>the one writing the code might be retarded so the language must be bad

Python and Ruby are duck typed languages that are really flexible for hacking up quick programs, but don't particularly run very fast. As the program size increases, however, the flexibility offered by both languages tends to encourage bad habits that in practice have shown not to scale too well. For automation, Ruby and Python are both nice, but you should not use them for serious programs.

Nice argument faggot, why not actually direct it at someone?
Too scared of being blown the fuck out?
Don't want to spill your starbucks coffee all over your new macbook?

did you measure the effect of dynamic binding on executable size? actually, did you do any dynamic binding on the rust side?

>lack of typing
>OO features
>naming convention
>messy interaction between features
>over realiance on FFI
>bad userbase
literally nothing bad about the language but what u don't like about it

Yes I am scared of those things

u idt lrn 2 pogram

not him but here is how I see the difference in use case between python and a compiled language

Being dynamic and having a REPL is a godsend when doing exploratory programming. I'll eventually ditch matlab/octave for numpy/pandas because not being constrained to "just a math DSL" permits more freedom, and you don't want to constrain your options when exploring.

Yet this is at the same time why it fails for long-maintained, evolving software. Any sort of architecture change is made difficult by its reliance on ad hoc convention, which is a given if you want to support explorative programming. Thus when the requirements change, your time to change the architecture will increase with each successive one.

That is not to say it is bad, certainly its use case should be defined and not misconstrued.

>being dynamic is a godsend
No. A decent type system with good inference and implicits is MUCH better.

what is the standard unix way to send a signal at a specified point in time?

>morphism thread
fucking hivemind, or are you the one who owns this acpid rootkit, because you can have it back. lol

>I can't read more than half way into a sentence

yes, I've chased the rabbit down the rabbit hole and here I am, looking at how fsync() is implemented...

I read the whole thing, I didn't need to reply to the rest.

>naming convention
>messy interaction between features
These are among the reasons I prefer Ruby over Python. They're not enough to make me hate Python though; just enough to make me not use it when Ruby is an option.

I am a haskeller (and doing a math major). Doing my algorithms for FFT you work on an N basis with functions from Z_n to the complex numbers. These bases sometimes have output on just the real numbers, with properties coming from the simetry of the transform matrix. Passing a bunch of (Num a) => ... around is as opaque as implicit casting, yet more cumbersome to use in this case.

If you know little about your data, why bind yourself to a model? You gain nothing (beyond passing Num a around) and the ergonomics are worse.

I would still like function composition in python/matlab though

>doing a math major
>simetry
I don't think I believe you

>What are you working on, Cred Forums?
Zybooks :')

You gain security, performance (when specialised), implicit documentation (through types), polymorphism, etc

I posted this in some previous DPT but it got lost, so I'll ask again.

I'm about to ditch SFML (lack of stable C++ ABI forcing me to compile my own versions constantly, huge size, etc.), but what do I replace it with?

SDL or GLFW?

Both have all the functionality I need. I just can't decide. SDL is more widely used and I've used it a few times in the past, but GLFW seems to be even smaller than SDL. Meaning, even less chance of something that can break.

english is not my first language sempai

...

> You gain security, performance (when specialised), implicit documentation (through types), polymorphism, etc
which are non-requirements when doing the exercises of the book that require such programs

> security
it is throwaway code

>performance
anything under half a second is instantaneous + no compilation times

> implicit documentation
this is a plus, certainly the order of arguments tells you a lot, but if you end up using lawless classes (like Num a, which is not monoid/ring/etc) then you lose some of it. Still a plus

> polymorphism
usually in these things I don't want parametric polymorphism, I want ad hoc polymorphism via good typeclasses. But those typeclasses aren't clear until I've explored a little bit more of the problem, by which point I've already solved it and moved on

Anyone know of some consistent framelimit code? The one built into SFML is inconsistent as hell.
GLFW is pretty hard compared to SDL.

>security
Reduces the number of possible mistakes.

>performance
Depends what you're doing

>lawless classes
Not ideal but there are still implied laws that you can generally rely on.

>polymorphism
You can do ad hoc polymorphism with type families too, but generally you should try to avoid that where possible.
Parametric polymorphism with type classes is great though, I don't see why you wouldn't want it. Stuff like Complex and Ratio use it too.

You don't really need to be defining your own classes all the time. I mean more using existing classes.

>but generally you should try to avoid that where possible.
I mean ad hoc polymorphism in general, not just type families. If you can do something cleanly without it, it's probably better. Ad hoc polymorphism makes your code more rigid.

Yeah Rust, like Go, is entirely statically bound.

>I mean ad hoc polymorphism in general, not just type families. If you can do something cleanly without it, it's probably better. Ad hoc polymorphism makes your code more rigid.

Yes, unless you have *laws* which you can rely on. And those seem all already defined, you need to instance them

You mean it's hard to use?

Yeah, I tried it a while ago and fucked off to SFML after about a week.

>unless you have laws
That's not very ad hoc, which is my point.
It's parametric with constraints.

If all the different instances obeyed very different laws, that would be ad hoc. Like

type family T (b :: Char) :: Type where
    T 'x' = Bool
    T 'a' = Char
    T 'z' = [String]
    T 'y' = IO Integer

I don't think Haskell has type level char literals with data kinds, but you get the point

It's memory usage

what I want is a way to do:

send a SIGALRM only when the current time is greater than or equal to some time_t

I know you can do intervals using setitimer, but I don't know of a way to schedule signals for a given absolute time
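POSIX timers can do this directly via an absolute expiry; a minimal sketch (alarm_at is a made-up helper name; on older glibc you need to link with -lrt):

#include <signal.h>
#include <time.h>

/* deliver SIGALRM once CLOCK_REALTIME passes `when` */
void alarm_at(time_t when)
{
    timer_t tid;
    struct sigevent sev = {
        .sigev_notify = SIGEV_SIGNAL,
        .sigev_signo = SIGALRM,
    };
    struct itimerspec its = {
        .it_value = { .tv_sec = when, .tv_nsec = 0 },
        /* it_interval left zero: fire once, not periodically */
    };

    timer_create(CLOCK_REALTIME, &sev, &tid);
    timer_settime(tid, TIMER_ABSTIME, &its, NULL);
}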

You're kinda opposite of me then. I'm trying to get away from SFML. SFML's got a nice API, but it isn't enough to save it.

And I can answer your first question, about framelimits: SFML's framelimit code is fucky. Very inconsistent. Enough to cause major frametime spikes even if you aren't rendering anything but a black screen. I guess one could write their own limit code and ignore what SFML has to offer, but I haven't bothered with that.

Been looking at job boards and everyone wants some UML expertise. What does Cred Forums recommend to git gud at it?

>SFML
I went back to SDL but then lost all my motivation and became a permanent nodev

>SFML's got a nice API, but it isn't enough to save it.
I'm just drawing lines to the screen, nothing complicated.
>I guess one could write their own limit code and ignore what SFML
Yeah, problem is that if you try to look up code for framelimiters it's just a bunch of people assuming you're doing something complicated and saying that you shouldn't tie logic to framerate.

hackage.haskell.org/package/containers-0.5.8.1/docs/Data-Map-Lazy.html

>O(log n)
>O(log n)
>O(log n)
>O(n)
>O(n)
>O(n)

UML is completely useless.

Had an interview where the guy interviewing said he didn't know it before he got the questions.

So what you're saying is you could've made up answers and changed UML forever?

thanks
It took me a while to understand the 'target: dependency' model, but now I get it.

with that in hand, i rewrote my build script
$(OUTPUT): $(patsubst $(SRC)/%.c,$(OBJ)/%.o, $(wildcard $(SRC)/*.c))
$(CC) -o $@ $^ $(LDFLAGS)

$(OBJ)/%.o: $(SRC)/%.c
@mkdir -p $(OBJ)
$(CC) $(CFLAGS) $(DEBUG) -I$(INC) -c $< -o $@

Is this correct counting in binary? (base2)
000 - 0
001 - 1
010 - 2
011 - 3
100 - 4
101 - 5
110 - 6
111 - 7
1000 - 8
1001 - 9
1010 - 10

yes

>special characters for arrows
Further proof Hascucks are just interested in making code that looks good and tipping their euphoria.

I'm using FiraCode, ~> has a pretty ligature

"0" && true // returns true and
!!"0" // returns true, but
"0" == false // also returns true

What's the type of >>= in this cute feminine notation?

>pretty ligature
Proving my point. You just want code to look pretty so you can wave it around to others, but the code is completely pointless.

>tfw I have to learn javascript soon and they're gonna cover memejs
pls no

exactly the same

Why do you want code to look ugly?
and the types are more descriptive this way too, the type of safeHead is "list to maybe" rather than "forall a. list a to maybe a", join' shows the monoidal nature of monads

I've got 2 char arrays with ones and zeros and want to XOR them.

I've put them both under a for loop and am XORing each element, but I'm not getting the right output because each element is being interpreted as an ASCII character. How do I fix this?

What the fuck are you talking about?
What do you expect XORing chars to do?

cast it to uint8_t or use %u if you're using printf

>exactly the same
boring

Looks like it's implemented as a BST.

Are BSTs the G (data structure) OAT?

No, van Emde Boas trees are.

>only useful as an alternative to an array

out[i] = (uint8_t)a[i] ^ (uint8_t)b[i];

it's xoring properly but i want the elements in out to be ascii 1s and 0s.

so instead of '\x1' for one i want '1'.

Learning C atm. Is this a mistake?

Probably.

>Learning C
Good idea
>Using C as a primary language
Bad idea

No.

this is pretty edgy

They have their uses, but they are neither the greatest nor the worst data structures. log(n) times for most shit is pretty damn nice, but it's a lot less nice when you're making lots of L2 and L3 cache misses.

Everyone should learn C. I recommend it as a first language, but it's okay if you've been tainted with another first.

>need to literally see the 1's and 0's instead of knowing the xor worked based on the 8bit integer
well you'll have to look up (or preferably write) an int to binary function assuming you're using C.

If you're using C++ this becomes so much easier since you can just use an std::bitset and call to_string() on it.

nevermind i got this to work

out[i] = ((uint8_t)operandL[i] ^ (uint8_t)operandR[i]) + '0'; /* '0' == 48, back to an ASCII digit */

Hey me too

what do you want to do as a programmer in general?

3d gaymen

I really don't have anything specific what I would want to do. I just enjoy coding and solving problems.
I have recently gotten interested in low level stuff.

About software publishing and privacy policy, what do they mean by collect data?

What if I never see any of their info because all of the information is stored on their devices/disk? Is that "collecting"?

What if I'm just filling forms for them and the cookie data is saved on their device? Is that collecting?

what does Cred Forums think of using jython for a python plugin library for my application, which is written in java?

>lowlevel/games
well then you should be good.
If you said something like webshit you'd want to reconsider.

Should I learn Rust?

Is it a useful language/will stay a useful language for a while?

no

No.

Can somebody PLEASE fucking tell me how to link GSL headers to a program file?

Yes.

explain

explain

It's not very good.

Yes, if you want to learn a language barely anybody uses, whose compiler is more annoying than Haskell's. You also get a taste of C++ syntax while being able to feel superior.

I'm in 5th semester CS and I passed everything except programming lectures, I didn't even pass the first semester stuff like introduction to java and C

what is wrong with me
I feel like I should quit

What don't you get?

yes you should

C is something every serious programmer should learn.
But I wouldn't suggest you write the majority of your projects in it.

I don't know it simply doesn't work
homework is easy
but as soon as they give me 30 pieces of paper and want me to write code on it my brain won't let me do it, like YOU KNOW HOW TO DO IT BUT I WON'T LET YOU HA-HA-HA
and when I'm finally about to write the time is over

thank you

Don't worry.

The courses I didn't pass turned out to be the courses I loved the most when I went back to them to study real hard.

You'll get there. Study study study.

You have performance anxiety or some shit.
Outline your code on a piece of paper (in pseudocode). Study some basic patterns or idioms of whatever language you're doing.

This isn't hard, you're just studying wrong.

>Programing is fun they say
>Lets get a degree
>Lets become a consultant

What you actually encounter out there in the wild:
>Java, C#, C++ and Objective C / Swift
> Wildfly with hibernate
>.Net with hibernate
>Apps
>Data warehousing

That's literally what the majority of the industry does.

Just because the entire world is boring doesn't mean programming can't be fun.

Does someone know how to check if an unsigned char is a Hex? 0 gets returned here for some reason.

Don't give up, you barely need any programming in CS anyway. As long as your math is good, you should be fine.

>What you actually encounter out there in the wild:
I'm so glad I didn't take the consultancy route, like 90% of my university friends. But money-wise, they make a hell of a lot more than me (embedded).

What the fuck is 'a Hex'?

hexadecimal number. $ ascii -h

that's a spell you can use to turn someone into a frog, or a sheep

`session.get(url)` returns something that has an interface defined with __enter__ and __exit__ methods that behave like constructor and destructor. Example for files: you would put fopen in the __enter__ and the fclose in the __exit__

You should learn elementary number theory.

An unsigned char can store an integer between 0 and 255.

An integer between 0 and 255 can be represented in binary, hexadecimal, octal, base 12, etc. Hexadecimal is just a matter of representation.

When would it ever return 1?

Upon implementing proper namespacing in my rushed code (rushed due to my group not making a single usable line of code before the deadline), my braindead group pulled variables that needed to be above the "scope" of the function that handles each round of a simple game into the "scope" of that function. Because we totally want to reset the amount of money the player has each time a new round starts, right?

What blows my mind is that these people are about to go out into a workplace. The course is 75% done, and they still do shit like this. I have to hold their hands to do a git commit in the terminal, and one wanted me to stay after class and help with logging "big text" to the console, genuinely wondering how others did that (think figlet/toilet).

Cool. So how do I loop through an unsigned char to see if it's represented as a hexadecimal (e.g. 0x0F)?

it would return 1 if the if-statements in the while loop are false. Can you even read code?

Yeah, then those people get out, come on Cred Forums, and start spouting "CS is a meme, I can't get muh job".

>Cool. So how do I loop through an unsigned char to see if its represented as a hexadecimal (e.g. 0x0F)?
You don't, because any integer can be represented in hex.

0x0F is just 15. 15 is just 0x0F.

unsigned char value = 0x0F;
printf("%02X\n", value); // prints 0F
printf("%02d\n", value); // prints 15

Well my teachers beg to differ... this is the assignment

The correct answer is "when c is 0 to begin with"

("0" && true) && ("0" == false)

True

Right, a function that checks if c is one of those letters.

You're talking about a string representation of hex you nigger.

int ishex(unsigned char c)
{
    return (c >= '0' && c <= '9') || (c >= 'a' && c <= 'f') || (c >= 'A' && c <= 'F');
}

the way I interpreted it was to check if c is a hexadecimal in the format 0-9 A-F

e.g. 0F (being 15)

Why should you ever use a paradigm that isn't OOP? Tell me a single example where OOP isn't enough to satisfy your needs. I'll wait.

> "0" == !"0"
true

You're a retard. An unsigned char is a NUMBER.

unsigned char a = 0x0F;
unsigned char b = 15;

printf("%d\n", a == b);

No shit.

#include <ctype.h>
int ishex(unsigned char c) { return isxdigit(c); }

Multiple dispatch.

When creating a state-capturing closure.

>inb4 functors

As long as you now get that that's always true then it's all good

>Bitshift overloads
W E W
E
W

>pipe operator is bitshift
W E W L A D
E
W
L
A
D

still doesn't return 1.

Oh, unless you meant that its hex representation should literally just be one number and one letter, in which case you'd need to test if it was between 10 and 159 inclusive (0A to 9F, unless I'm a retard)

Where the hell is compiler code?

This repo is just a bunch of READMEs and TODOs.

github.com/MasterZean/z2c

Fucking typical Cred Forums project, all talk and only a logo and a fancy project name to show for it.

Just kill yourself.

Can you explain why you think it should return 1 here?

#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

void ishex(uint8_t c) {
    printf("the number %c is a hex %hhx\n", c, c);
}

int main(void) {
    ishex(getchar());
    return EXIT_SUCCESS;
}

Because 0x0F is a hex.

this is c++ m8

> CrashCouse

are they even trying?

class Counter:
    def __init__(self, start = 0):
        self.count = start - 1
    def __call__(self):
        self.count += 1
        return self.count

But ASCII 15 is not a "hex-symbol", to quote your assignment. Chars that are valid hexadecimal symbols are 48 ('0') to 57 ('9'), 65 ('A') to 70 ('F'), and 97 ('a') to 102 ('f').
So, if c = 0xF, it is not a valid hexadecimal symbol, but if c = 'F', it is.

>bitshift is pipe operator
W E W S O N N Y
E
W
S
O
N
N
Y

That's not a state capturing closure, that's a counter.

def createAdder(n):
    def adder(x):
        return n + x
    return adder

addFive = createAdder(5)
print addFive(1)
print addFive(2)
print addFive(3)

>language doesn't let you shift an io stream by a string
A B S O L U T E

M
A
D
M
A
N

>bitshift

SHIFTL and SHIFTR are bitshift operators, user.

class Adder:
    def __init__(self, n):
        self.n = n
    def __call__(self, x):
        return self.n + x

addFive = Adder(5)
print addFive(1)
print addFive(2)
print addFive(3)

don't forget to create an additional Printer class. Also you need a Manager class which will call those. And finally you need to kill yourself.

Easy there, Ranjeet.

I dunno what is funnier, your inability to use "else if", or everyone replying to you who doesn't understand the difference between hexadecimal notation and hex character literals.

You don't need a while loop or to increment at all, the specification only wants to accept single chars, not null-terminated strings.
Change your conditionals to be inclusive instead of exclusive, makes the logic easier to follow.

Not going to completely spoon-feed you, so something like this:
if ((c is a digit) || (c is lowercase hex char) || (c is uppercase hex char)) {
return 1;
}
return 0;

When you aren't writing a small script in a largely pre-defined environment that maps directly to the real world, like a level in a game.

return (c is a digit) || (c is lowercase hex char) || (c is uppercase hex char);

>everyone replying to you who doesn't understand the difference between hexadecimal notation and hex character literals
i think the only one who's not getting it is the guy himself

So let's say I want to make some utility programs for Android, since it sucks. How can I do that while avoiding Java and downloading 500000000GB of bullshit?

Why would I ever use this over (+) 5?

you take a shart in mart

NDK

I have been designing a 16-bit ISA for fun. Nothing useful will likely come of this, but it would be something interesting to design an emulator around.

Features:

16 16-bit general purpose registers. These are more or less relegated to whatever purpose one wants them for. There is nothing special purpose, like an accumulator or data register.

Fixed-width, 16-bit instructions. There are only 65536 valid instructions, of which three quarters are dedicated to instructions which take immediates. 16384 of these are dedicated to 4 instructions which take an 8-bit immediate, and 32768 instructions are dedicated to 2 jump instructions that each take a 14-bit, shifted and sign-extended immediate for relative jumps.

16-bit address space, with the option to bank switch any of 4 16KiB blocks of memory from a larger program ROM that can be up to 4 MiB in size. The last of these blocks of memory is dedicated to I/O, however, and should not be used for this purpose.

Extensible up to 15 co-processors. A special instruction allows the CPU to switch into a different mode and interpret instructions as a different processor. This could allow it to, for example, have one mode that understands floating point instructions, and one mode that understands integer instructions. It might also allow the processor to act as a bootstrapper for a 32-bit or 64-bit ISA

...
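A hypothetical fetch/decode skeleton an emulator for it might use; the exact bit layout below is my own guess for illustration, only the instruction-count split matches the description above:

#include <stdint.h>

uint16_t regs[16];     /* 16 general-purpose registers */
uint16_t pc;           /* program counter, in 16-bit words */
extern uint16_t mem[]; /* the banked 16-bit address space */

void step(void)
{
    uint16_t instr = mem[pc++];

    if (instr & 0x8000) {
        /* hypothetical: top bit set = one of the 2 jumps, bit 14
           selects which, low 14 bits are the relative immediate */
        int16_t off = (int16_t)(instr << 2) >> 2; /* sign-extend 14 bits
                                                     (assumes arithmetic shift) */
        pc += off;
    } else if (instr & 0x4000) {
        /* hypothetical: 2 op bits, 4 register bits, 8-bit immediate */
        uint8_t op  = (instr >> 12) & 0x3;
        uint8_t rd  = (instr >> 8) & 0xF;
        uint8_t imm = instr & 0xFF;
        if (op == 0)
            regs[rd] = imm; /* e.g. load immediate */
        /* ... */
    } else {
        /* remaining quarter: register-register ops */
    }
}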

Yeah I'm bored.

Lad, thats how they train to become best pajeets they can.

>I have been designing a 16-bit ISA for fun.
Get a life, Ruby

(+5) doesn't mutate any state, whereas the counter does. If you get triggered by state, go complain to the other anon; he's the one who wanted glorified currying.

0x0F is 15 is 1111. The only difference is in how you've written it. If you want a function that determines whether c is a hexadecimal or not, write a function that always returns true.

Operator overloading is not "OOP", try doing that in Java.

Also, try doing the following:

#include <cstdio>
#include <functional>
#include <thread>
#include <chrono>

using namespace std::chrono_literals;

typedef std::function<bool(int)> Callback;

void counterThread(Callback limit)
{
    int value = 0;

    while (!limit(value))
    {
        value++;
        std::this_thread::sleep_for(2s);
    }
    printf("%d\n", value);
}

int main()
{
    std::thread threads[3];

    for (int i = 0; i < 3; ++i)
    {
        threads[i] = std::thread( [i](int j) -> bool {
            return i == j;
        });
    }

    for (int i = 0; i < 3; ++i)
    {
        threads[i].join();
    }

    return 0;
}

Redirects are not pipes m8.

If memestream languages actually thought anything through, they'd use the actual fucking pipe symbol, either once ('|') or twice ("||").
You can't pipe into stdin, the direction of the symbol is fucking meaningless, you either read from it or syntax error.

Typo there, should say

threads[i] = std::thread(counterThread, [i](int j) -> bool { return i == j; });

>not an SJW
Do people seriously decide which programming language to use based on their political beliefs?

yes, let's create a class and add extra overhead to an already slow language only so we can do
foo = 1 + pajeet

poo in the loo

I do have one, but I'm bored and it's the weekend, and I'm still not used to going to bed before 5 in the morning because it's sadly not summer anymore.

Pair programming can be pretty good, but only if the two are competent.

>he thinks bash syntax is the only way of doing piping

learn Haskell

Still needs a java frontend.

NDK just lets you run C libraries, you still need something to call them.

Even the thinnest possible wrapper will require you to make android XML layouts.

>inb4 xamarin
That's even worse.

>z2legacy.wordpress.com/2015/12/09/crashcouse-001-hello-world/
>Z2 has a lot in common with C++.
Are you fucking serious?

When you compare two objects of custom classes (my question is for AS3 but I'm also curious about C# and Java) with an == operator, does it compare every element of that class?

Would it be faster to compare an element of the class (unique ID int) instead?

Operator overloading here is merely a syntactical convenience. It's equivalent to calling addFive.__call__.

And all that's necessary to make that code OOPOO is adding
class Callback {
    int i;
public:
    Callback(int i) : i(i) {}
    bool operator()(int j) { return i == j; }
};
and
threads[i] = std::thread(counterThread, Callback(i));
Of course, it's completely unnecessary to do it this way as opposed to function pointers, but the point is to illustrate capability, which the other anon doesn't get.

new bread:

My time isn't that worthless...

>She thinks a language that uses ">>" to mean two completely different things isn't trash.

But then you have to inherit Callback and create your own type every time you want the callback to do something custom.

Also, functors can't be anonymous and created inline, you have to declare them.

Of course this example is contrived, but imagine you have an async messaging API that takes a callback, but you want to block until you get the response.

So the send message call looks something like this
typedef std::function<void(bool)> Callback;

void sendMessage(int destination, Callback handleResponse);


Then you could do something like this
bool getBool(int destination)
{
    bool response = false;
    std::mutex mtx;
    std::condition_variable cv;

    std::unique_lock<std::mutex> lock(mtx);
    sendMessage(destination, [&response, &cv, &mtx](bool r) {
        std::lock_guard<std::mutex> lock(mtx);
        response = r;
        cv.notify_one();
    });

    cv.wait(lock, /* ... */);
    return response;
}

In java, it checks if the references both point to the same object.
The contents are not inspected.

If you make 2 objects with the same constructor params and compare them with "==", the result is entirely dependent on optimisation levels.

Java will automatically intern string representations of low-value ints, so (new String("10") == new String("10")) is likely to be true.

However, (new String("1000000") == new String("1000000")) is likely to be false.

Because of the inconsistency, you should never compare references unless you actually want to check if two things point to the same object.

If you want to check the contents, check the contents.

is there no hope? I just googled and it seems annoying to set up.

Android installs apps from the play store.

The play store passes out apks.

Apks require java (a wrapper at the minimum).

Also NDK is a massive pain in the ass, it's somewhat automated now but they still recommend you only use it when absolutely necessary.

Damn, there go my hopes. I'm sticking to desktop. Even if I were to use Java, I still don't want to download 50 different SDKs, 100 emulators, a gigantic IDE, and who knows what else

Thanks for the info; == comparing references makes a lot of sense.

The "10" with "10" being (mostly) true and "1000000" with "1000000" being mostly false, not so much.

It's not really that bad, the IDE installer is ~500MB at the moment.

The SDKs are quite large, but you only really need one at any given time.

Worst case really is you might need to keep two, your minimum supported API level and your target API.
Also the emulators are only really a problem if you actually care about testing on every possible target device (you really shouldn't).

Generally I just keep an x86 emulator of my minimum API, and test with a physical device running the target API.

The tools are fucking huge in comparison to simple text editors, but then even something like Qt is a fucking massive install.
Any graphical toolchain is going to be large, Android at least covers everything you need.

I may have actually gotten the string interning mixed up with Integer boxing, but the point is unless you know EXACTLY what the implementation is doing, you can never be sure.

en.wikipedia.org/wiki/String_interning

So it's not that bad, huh?

Hmm, I might give it a try. Can you use Java's native interface without the NDK? I'd really rather write wrappers around C than do things with Java...

The NDK is just toolchains for building C/C++ .so files for android device architectures (ARMv7, and maybe x86 if you care about android laptops).

The Android OS itself is basically a giant NDK project, just a shitload of java wrappers around a linux kernel.

I was under the impression that you can write entirely in C/C++ with NDK. It gives you access to OpenGL ES and such.

damn, where has the KISS principle gone? Most of the apps that are paid in the app store or have ads are super simple to make on desktop, so I thought I'd just make my own.
Fucking ads

Yes and no. In a large enough project, your native code will be doing most of the hard work so it'll make the java side seem tiny.

But for relatively simple utility projects, you're going to need a lot of boilerplate.

Here's an example of using native OpenGL: github.com/googlesamples/android-ndk/tree/android-mk/hello-gl2

In most cases, you can just import and use libraries to do almost anything, but at the expense of complicated non-customised initialisations and bloat.

Also they don't have a full implementation of the C++ Standard library iirc.

excuse me?

>there are people who still use OOP in current year

OOPfags are killing programming.