This brings up another thing. I've basically been moved into the position of team lead for this team (promoted at the end of last year). Now we're expanding pretty quickly and moving our head count from 3 to 6 developers. Any recommendations for good books or articles to read about being an effective manager without pissing everyone off? I'm going to have to onboard and train three new people pretty soon. These are all developers with as much or more experience than me, but I am the one with the institutional knowledge of our tools and systems.
One of these people is moving into a position more senior than me; we're expecting to hire someone with at least 10 more years of experience than me to basically be an architect-level individual contributor.
Edit: We have about 25 in-house programmers in my department and we're adding 3 more to the team I'm on.
I'd give my vote for Alison Green of Ask A Manager, myself - there's a lot of good advice there on how to manage and lead. (As well as some impressive stories of bug fuck crazy.)
jesus christ do you make video games or something
Worse; I work for lawyers
and there are 25 of you?!
not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
Project I'm working on now is pretty much all functional programming everywhere, front-end and back-end. This is the first time I've been exposed to a code base done almost entirely in the functional style, and one thing I've had trouble figuring out is where shit actually comes from. I see the functions being passed around, but tracking down where the data passed to those functions originates has been difficult, even in Java. The JavaScript/React/Redux side has been even harder, since it seems like tons of magic is going on. This style seems a lot harder to onboard onto unless you're already very familiar with the code base.
I had the same issue with Rails, even though it's not strictly functional. It is super easy to create your own DSL that invisibly passes around and populates everything. It's both very clever and completely impenetrable.
I work for one of the largest law firms in the world. I didn't even know that law firms needed programmers before I started this job. Also makes sense why we have trouble finding people, nobody even knows that we exist. If I told you my company name, unless you're tuned in to the world of lawyers, you wouldn't recognize it.
Fun fact, though: remember a year or two back when a bunch of lawyers descended on airports to help out with the travel ban? Those lawyers were using a system that I helped build for our Pro Bono practice to communicate with each other. I felt pretty good about that one.
well then the functional people just counter back and say that OOP style is impossible to grok
personally.... I've witnessed with my own eyes the best and worst of both worlds
and under the assumption that everyone is terrible by default, I'm way more content with the "worst" OOP code bases, as opposed to the worst functional code bases
All in all, these style wars are overrated. What is not overrated? Careful planning, documentation, and accountability of the developers in charge. If you have that, then mysteriously it doesn't seem to matter how you build your software! Huh, go figure
I'm not ragging on functional programming; I actually really enjoy working in Ruby! Overall, I still prefer OOP, but I'm glad we're finally getting to the point where everyone else has all the set operations that Ruby spoiled me with. I include System.Linq in basically every C# project these days.
If I need to create a system architecture diagram, are there any standard icons that would be recognizable to the broadest group of people?
Like if I need to represent web servers, databases, etc, and this diagram will be read by DBAs, data scientists, programmers, and a few others, are there any industry standards that I could use?
There's no reason you can't have an OOP application that uses some functional/pure functions.
Functional programming definitely has its uses, but in my experience you can't cling so tightly to a single paradigm. It's so easy to get downright evangelical about our opinions sometimes, but it's important to remember that we need to be flexible depending on the problem we are trying to solve. I've seen too many people get hung up on the idea that their way is better than everyone else's that any dissenting opinion is an affront to their humanity.
Some people use UML for this, but I've always felt it to be pretty stilted and limiting. I like my drawings to be freeform.
I tend to use boxes for servers or databases, lines with arrows to show how data is flowing between storage locations, and I use bubbles or clouds for web services that are going to be doing business logic. Arrows flow into and out of the boxes, and the arrows have labels with what I expect the data to be.
Most people that you show the diagram to will understand if you do it that way. If you want it actually formalized, UML probably can't hurt.
I'm having some trouble getting started on a topic as I don't know what it's called (or what I think it's called has other meanings blocking my search). If I'm going to write a program, and that program can detect other copies of the same program running on systems on the local network, without any server setup or whatnot, what is that called?
Also, is that something an Electron/Cordova app could even do?
You're probably looking for UDP broadcast, if you don't have an existing server anywhere that they can use to find one another.
(edit: googling for 'electron udp broadcast' / 'cordova udp broadcast', I get hits, but a lot of them seem to have had no changes for a few years now, and for this sort of os-level network stuff it feels like they might just have stopped working with OS updates, so you'd want to try them and see. Either way you'll most likely need a platform-specific helper of some sort)
Vaguely related: I just discovered https://dbdiagram.io/ for knocking out quick DB diagrams using a simple markup language.
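For anyone curious, its markup language (DBML) looks roughly like this; the table and column names here are invented:

```
Table users {
  id int [pk, increment]
  email varchar [unique]
}

Table posts {
  id int [pk, increment]
  user_id int [ref: > users.id]  // many-to-one back to users
  body text
}
```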
I love those kinds of tools, but I'm always suspicious of how much they keep stored on their servers. Like, if I had some stuff I wanted to prototype for school? No problem using an app like that. If I wanted to prototype stuff for work-work? I don't think I could use it, since I inherently lose control of the content.
put a modified form of this in the tech thread last night
yesterday I thought I would add some functionality to what I shall charitably call my 'dev console', and considered the problem of taking an int or float variable and converting it to a string for my 'renderText' function. I dug around a little and found the sprintf function. I wrote up a small source file and then intuited the line
char test[] = { sprintf(test, "%d", a) };
when I then invoked
printf("%s", test);
it printed out [an unprintable character], which was disappointing. But it compiled and so I became interested in pursuing it. Where did this character come from? Is it genuine garbage, or just something unexpected?
I got someone to look at it, and he was just amazed it compiled. We ran some more tests, but it insisted on putting out that [unprintable]. I put it away for a couple hours, then swung back around: looking sprintf up on a reference site, I realized I was being very dumb. For whatever reason I had been assuming it would return the string. In hindsight this makes no sense and I don't know why I assumed that.
so I ran some more tests with that information. I gave 'a' the value of 1234567, and it did exactly what I expected: when it hit the printf statement, the system BEL sounded. And so I added a space in the "%d", and gave printf 'test+1', which began working consistently.
So on the one hand I'm happy that I solved the mystery of [unprintable]. For a little while I was elated, geeking out, over the "how did that work?"-ness of it. On the other I'm frustrated because it is clearly not OK. When I came up with the line it felt like I was finally putting the pieces together, reaching a genuine milestone. I mean, it felt like something I'd see in the K&R book: wizardry.
(I did drop it into the larger program, out of curiosity. As you no doubt have guessed, it does strange and terrible things and cannot be relied upon. But I'm still giggling over it.)
Hey, the fun thing you've been playing with is actually an entire class of vulnerabilities in C:
It's important to remember that chars aren't anything special. They're essentially just small integers, one byte wide (or some other fixed width). A string is just an array of these integers.
Also, this runs into some shit you'll need to know for sockets/streams/buffers. You can use control characters outside the printable ASCII range as delimiters in a buffer of typed ASCII. You just need to error-check, because funny things start happening if people dump non-ASCII into the buffer.
Just keep in mind that when you move to C++, you can't use std::string for a lot of this stuff because of the automatic control characters and such. Also, once you move into Unicode you can no longer use all this fun quick math, because the width of a character can and will change.
String processing in C is real flaky in general. If you run into a scenario where you're doing heavy text processing, switch to a language more suited to it (IMO anyway).
I like that Go is UTF-8 by default. A string with one Unicode character can be several bytes in actual length if treated as bytes, and it has the `rune` type for actual individual UTF-8 characters.
Yeah not for text processing necessarily. But for handling something like base64 in a circular buffer or working directly with a serial stream or something.
Uhm. I might be a bit off-base here, but that code is running a chill up my spine. As near as I can tell, there are a couple factors interacting to produce the weird results. First, the array "test" is being created with a size of a single element (there is only one entry in the set provided within { .. } ). Since this is being done on the stack, the size has to be known at compile time, and the only information the compiler has at that point is that the array needs to hold a single element, in this case the return value of sprintf(...). Second, sprintf(..) returns an integer, the number of characters printed to the char* passed in as the first argument. Any integer will result in a buffer overrun here (although it may be simply truncated to sizeof(char)), as again, the buffer "test" is practically char[1].
On top of this, the string created by the call is written into "test", which will always result in a buffer overrun as well (because even a single digit also requires a null terminator, for a total of 2 chars). So, what's happening is a string is being written into space you don't formally own. This isn't resulting in a segfault because it's happening on the stack (and the stars have aligned for you!). The reason printf(..) is working on "test+1" is that the char pointed to by "test" is the return value of sprintf (an integer), followed by the rest of the written string. You can verify this for yourself by declaring a few more variables on the stack after the posted lines and giving them some values (I seem to recall that C guarantees the ordering of variables declared in a block is the same as their ordering on the stack). Then try printing the string again. Alternatively, fire up the debugger and just have a look at the addresses in question.
The idiomatic way to fix this would be to preallocate a buffer of predetermined maximal size (for example, char[32] or char[64] or whatever) and subsequently pass that buffer into sprintf(..). You still risk overruns (if the string ends up being longer) but.. eh. It's C. Whatcha gonna do? There may be a comparable method to sprintf(..) which takes a max-length argument, but I haven't worked with C for a good while so I can't help you there, sorry! Anyway, hope this makes the strange behaviour of that code a little clearer :]
For C, the size-limited version is (usually) the same name with an "n" put in the name somewhere. In this case, snprintf.
But if you want to do dynamic allocation, asprintf is 1000x better than...this, and about 10x better than malloc+snprintf.
Got some new toys in the mail - a bunch of Wemos D1 Mini cards.
It took me 15 minutes of dabbling to do some cool stuff using this library. If there are no wifi settings stored, it spins up a wifi access point and a DNS server, and then runs a captive portal. Connect to its wifi network, use the portal to configure the actual wifi settings, and it stores them, reboots, and then hands control back to the rest of your code. Super handy way to configure wifi without having to compile and upload a new binary with your wifi settings hardcoded.
Some more dabbling to add a MQTT library and now it publishes its uptime in milliseconds to a remote MQTT broker.
had a production hiccup today that I *thought* was one that I hadn't solved yet
i was all mentally prepared to dive in, i started diagramming shit, swearing that I've painted myself into a corner design-wise, etc, etc
nevermind it self-healed. ok great. guess ill put away all these diagrams... cancel plans to order a new white board... maybe go back into my code to try to remember exactly how past-me already solved this
This kind of situation is unique in that it can instill in me both relief and dread at the same time. Like, I'm happy the problem went away, but now I may never discover the root cause.
Well, it's been a while, because I've been crazy busy working on a lot of cool stuff. My employer is looking to hire - primarily a Ruby/Rails dev, or someone with solid similar experience willing to get up to speed on Rails. Longer term, expect opportunities to do Django, possibly PHP if you really want to, etc.
Remote work may be a possibility, but relocation to Richmond, VA is definitely preferred. If you're interested or know anyone who might be, send me a PM and I'll be happy to provide more details on the company, immediate needs, etc.
Also, what's up guys? I'm pretending to be a rails dev lately (I'm a Django guy, so it's painful) and doing a lot of interesting DevOps related stuff. Doing a ton of fun Go stuff on the side and have a personal project (and potential future money maker) close to an alpha level usable state for my personal needs and am super close to jumping into some Xamarin stuff for the client/mobile side of that.
I'm also sampling liquors tonight. Tried some Kirschwasser I picked up and some locally distilled Applejack, and am comparing my standard go-to bourbon to a semi-local in-state one. That's unrelated, but you know, it's important stuff.
In C++, Is there a simpler way to parse a formula string (end-of-line terminated) into an AST that has access to a hash map containing variables than using flex & bison?
I mean, what I want to do is very close to the bison example calc++ (except with more operators and functions), except I want a loop: evaluate the expression, save the result in a vector, update the values of the variables, then evaluate again, and so on until the variables' values have reached a certain point.
Goddamn it, it shouldn't be this hard! I did it in Java years ago, and I don't want to port my Java code to C++ (because I already tried and failed)...
Also, if I do succeed in making this serial expression evaluator library in C++, how do I integrate it into C# code? I was thinking of making a website where people put formulas in the front-end and get a file containing the result of the calculations, and I'd like to error-check/sanitize people's entries entirely client-side. I was thinking of doing it with ASP.NET. I tried doing it with a MEAN stack, but it was super-jank. Also, I don't want to use javascript on the server side of things.
Or should I just cave in, use bison/flex with C, and do my website in a Python framework?
Feel free to PM me if you have more detailed questions.
http://www.gamasutra.com/blogs/PaulTozour/20150126/235024/The_Game_Outcomes_Project_Part_5_What_Great_Teams_Do.php
I had the same issue with Rails, even though it's not strictly functional. It is super easy to create your own DSL that invisibly passes around and populates everything. It's both very clever and completely impenetrable.
I work for one of the largest law firms in the world. I didn't even know that law firms needed programmers before I started this job. Also makes sense why we have trouble finding people, nobody even knows that we exist. If I told you my company name, unless you're tuned in to the world of lawyers, you wouldn't recognize it.
Fun fact, though: remember a year or two back when a bunch of lawyers descended on airports to help out with the travel ban? Those lawyers were using a system that I helped build for our Pro Bono practice to communicate with each other. I felt pretty good about that one.
well then the functional people just counter back and say that OOP style is impossible to grok
personally.... I've witnessed with my own eyes the best and worst of both worlds
and under the assumption that everyone is terrible by default, I'm way more content with the "worst" OOP code bases, as opposed to the worst functional code bases
all in all, these style wars are overrated. What is not overrated? Careful planning, documentation, and accountability of the developers in charge. If you have that, then mysteriously it doesn't seem to matter how you build your software! Huh, go figure
I'm not ragging on functional programming, I actually really enjoy working in Ruby! Overall, I still prefer OOP, but I'm glad we're finally getting to the point that everyone else has all the set operations that Ruby spoiled me with. I include System.Linq in basically every C# project these days.
Like if I need to represent web servers, databases, etc, and this diagram will be read by DBAs, data scientists, programmers, and a few others, are there any industry standards that I could use?
Functional programming definitely has its uses, but in my experience you can't cling so tightly to a single paradigm. It's so easy to get downright evangelical about our opinions sometimes, but it's important to remember that we need to be flexible depending on the problem we're trying to solve. I've seen too many people get so hung up on the idea that their way is better than everyone else's that any dissenting opinion is an affront to their humanity.
Some people use UML for this, but I've always felt it to be pretty stilted and limiting. I like my drawings to be freeform.
I tend to use boxes for servers or databases, lines with arrows to show how data is flowing between storage locations, and I use bubbles or clouds for web services that are going to be doing business logic. Arrows flow into and out of the boxes, and the arrows have labels with what I expect the data to be.
Most people that you show the diagram to will understand if you do it that way. If you want it actually formalized, UML probably can't hurt.
Also, is that something an Electron/Cordova app could even do?
(edit: googling for 'electron udp broadcast' / 'cordova udp broadcast', I get hits, but a lot of them seem to have had no changes for a few years now, and for this sort of os-level network stuff it feels like they might just have stopped working with OS updates, so you'd want to try them and see. Either way you'll most likely need a platform-specific helper of some sort)
http://plantuml.com/
It's a layer over graphviz, but it's great for most enterprise "draw a system" stuff.
Pretty neat
I like how a low number of trials (the legend in the first graph) shows that there's overhead associated with creating 6+ threads
yesterday I thought I would add some functionality to what I shall charitably call my 'dev console', and considered the problem of taking an int or float variable and converting it to a string for my 'renderText' function. I dug around a little and found the sprintf function. I wrote up a small source file and then intuited the line
char test[] = { sprintf(test, "%d", a) };
when I then invoked
printf("%s", test);
it printed out [an unprintable character], which was disappointing. But it compiled and so I became interested in pursuing it. Where did this character come from? Is it genuine garbage, or just something unexpected?
I got someone to look at it, and he was just amazed it compiled. We ran some more tests, but it insisted on putting out that [unprintable]. I put it away for a couple hours, then swung back around: looking sprintf up on a reference site, I realized I was being very dumb. For whatever reason I had been assuming it would return the string. In hindsight this makes no sense and I don't know why I assumed that.
so I ran some more tests with that information. I gave "a" the value 1234567, and it did exactly what I expected: when it hit the printf statement, the system BEL sounded. And so I added a space in the "%d", and gave printf 'test+1', which began working consistently.
So on the one hand I'm happy that I solved the mystery of [unprintable]. For a little while I was elated, geeking out, over the "how did that work?"-ness of it. On the other I'm frustrated because it is clearly not OK. When I came up with the line it felt like I was finally putting the pieces together, reaching a genuine milestone. I mean, it felt like something I'd see in the K&R book: wizardry.
(I did drop it into the larger program, out of curiosity. As you no doubt have guessed, it does strange and terrible things and cannot be relied upon. But I'm still giggling over it.)
Hey, the fun thing you've been playing with is actually an entire class of vulnerabilities in C:
https://en.wikipedia.org/wiki/Uncontrolled_format_string
Also this will hit into some shit you'll need to know for sockets/streams/buffers. You can use control characters that aren't normal ASCII like that to use for delimiters in your buffer for typed ASCII. You just need to make sure you error correct because funny things start happening if people dump non-ascii into the buffer.
Just keep in mind when you move to C++, you can't use std::string for a lot of this stuff because of the automatic control characters and such. Also, once you move into Unicode you can no longer use all this fun quick math, because the width of a char can/will change.
I like that Go is UTF-8 by default. A string with one Unicode character can be several bytes in actual length if treated as bytes, and it has the `rune` type for individual Unicode code points.
Yeah not for text processing necessarily. But for handling something like base64 in a circular buffer or working directly with a serial stream or something.
8-)
and that's when I stabbed them your honor
I, too, remember 1990
Uhm. I might be a bit off-base here, but that code is running a chill up my spine. As near as I can tell, there are a couple factors interacting to produce the weird results. First, the array "test" is being created with a size of a single element (there is only one entry in the set provided within { .. } ). Since this is being done on the stack, the size has to be known at compile time, and the only information the compiler has at that point is that the array needs to hold a single element, in this case the return value of sprintf(...). Second, sprintf(..) returns an integer, the number of characters printed to the char* passed in as the first argument. Any integer will result in a buffer overrun here (although it may be simply truncated to sizeof(char)), as again, the buffer "test" is practically char[1].
On top of this, the string created by the method is written into "test", will always result in a buffer overrun as well (because even a single digit will also require a null terminator, for a total of 2 chars). So, what's happening is a string is being written into space you don't formally own. This isn't resulting in a segfault because it's happening on the stack (and the stars have aligned for you!). The reason printf(..) is working on "test+1" is that the char pointed to by "test" is the return value of sprintf (an integer), followed by the rest of the written string. You can verify this for yourself by declaring a few more variables on the stack after the posted lines and give them some values (I seem to recall that C guarantees the ordering of variables declared in a block is the same as their ordering on the stack). Then try printing the string again. Alternately, fire up the debugger and just have a look at the addresses in question.
The idiomatic way to fix this would be to preallocate a buffer of predetermined maximal size (for example, char[32] or char[64] or whatever) and subsequently pass that buffer into sprintf(..). You still risk overruns (if the string ends up being longer) but.. eh. It's C. Whatcha gonna do? There may be a comparable method to sprintf(..) which takes a max-length argument, but I haven't worked with C for a good while so I can't help you there, sorry! Anyway, hope this makes the strange behaviour of that code a little clearer :]
For C, the size-limited version is (usually) the same name with an "n" put in the name somewhere. In this case, snprintf.
But if you want to do dynamic allocation, asprintf is 1000x better than...this, and about 10x better than malloc+snprintf.
It took me 15 minutes of dabbling to do some cool stuff using this library. If there are no wifi settings stored, it spins up a wifi access point and a DNS server, and then runs a captive portal. Connect to its wifi network, use the portal to configure the actual wifi settings, and it stores them, reboots, and then hands control back to the rest of your code. Super handy way to configure wifi without having to compile and upload a new binary with your wifi settings hardcoded.
Some more dabbling to add a MQTT library and now it publishes its uptime in milliseconds to a remote MQTT broker.
i was all mentally prepared to dive in, i started diagramming shit, swearing that I've painted myself into a corner design-wise, etc, etc
never mind, it self-healed. ok great. guess I'll put away all these diagrams... cancel plans to order a new whiteboard... maybe go back into my code to try to remember exactly how past-me already solved this
This kind of situation is unique in that it can instill in me both relief and dread at the same time. Like, I'm happy the problem went away, but now I may never discover the root cause.
Gorram solar flares.