[ home / rules / faq ] [ overboard / sfw / alt ] [ leftypol / siberia / hobby / tech / edu / games / anime / music / draw / AKM ] [ meta / roulette ] [ cytube / wiki / git ] [ GET / ref / marx / booru / zine ]

/tech/ - Technology

"Technology reveals the active relation of man to nature" - Karl Marx

Join our Matrix Chat <=> IRC: #leftypol on Rizon


File: 1671697504344.jpg (25.14 KB, 320x240, v1kky.jpg)


Reminder that Vicky is canonically a computer hacker.






Episode Scout's Honor (pilot season, ep 8)
>"And if you all play your cards right, I just might teach'a how to hack into the school's computer system, just like I did."



Anyone else confused about these titles? I thought "senior" would mean someone with like a decade of work experience in the field, but I keep seeing people get promoted to it after only 2-3 years. Is there any meaning behind the titles, or are they just some extra "prestige" perk companies throw around to attract/retain people?


Also it seems the best way to "climb the corporate ladder" is to switch jobs every other year. I've been working at the same place for four years now; does that mean I'll be stuck there until civilisation collapses?


Quite possibly, yes. The other alternative is to play hardball with the managers and try to get promoted.


well yeah, and it's not too surprising:
if there's a job you need done, you can find more people with a wider variety of skillsets in the labor pool than inside your company, and with less hassle than promoting an existing employee. there's really no incentive to reward loyalty to the company, so why would they.


job titles are meaningless /thread


lol my senior engineer was like 22, an ex-contractor in [asian country].
ageism is dumb.

File: 1659415572224.gif (1.64 MB, 659x609, 1445909508132.gif)


The STEM student -> defense contractor pipeline is a serious issue in this country. In many places and in some fields, the majority of available jobs are with defense contractors. That shit is bleak and horrifying, man. The listings say things like "assist in creating autonomous AI for delivering payloads towards targets" and you know exactly what they mean lol. I took a robotics software class where we discussed "efficiently tracking multiple moving targets with distinct trajectories" as an academic problem and briefly wondered if I hadn't chosen the wrong field of study.
2 posts and 1 image reply omitted.


People would probably not care about some random queer failson's fifteen years of IT work for a weapons manufacturer if he didn't position himself as some poor, put-upon marginalized personality who jockeys for social power on Twitter. Pretty important context!
A fairly common issue within queer "spaces" (and, if we are being honest, plenty of other 'marginalized' leftism-as-aesthetics social groups) is that the people who agitate the most for social power & woe-is-me clout consistently end up, uh, being full of bullshit affectations.


most surreal shit is that the asshole who tried to cancel and dogpile the attack helicopter book author works on actual attack helicopters



just do a new 9/11


These people should be socially exiled from online spaces.

File: 1669933947460.png (119.62 KB, 1903x834, mom.png)


Why does /tech/ never talk about MIDDLEWARE and integration, despite it being such a huge part of enterprise stuff?

I mean JMS/Kafka/Camel/Artemis/ActiveMQ/RabbitMQ, ESBs, etc.

Queues, distributed streaming, event-sourcing brokers (Kafka), etc., not to mention cloud pub/sub systems.
6 posts and 2 image replies omitted.


Middleware is an overloaded term but in the sense that OP is using it, you do not program middleware. You use them. TBH most programming is basically configuration and connecting shit. The hardest part is modelling data and organizing your files so you don't make a mess. Maybe testing can be hard too. There are also clients for connecting to middleware that are well tested, so it's not particularly hard to use it.


>you do not program middleware. You use them.
yeah but knowing how, and more importantly why, to use them also requires knowledge: you have to (or should) learn about integration patterns, which a lot of people don't.
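To make the "queues" pattern concrete, here is a minimal sketch of point-to-point messaging using only the JDK's BlockingQueue. All names here are illustrative; a real broker like ActiveMQ or RabbitMQ adds persistence, acknowledgements, and network distribution on top of this same basic shape.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class QueueSketch {
    public static void main(String[] args) throws InterruptedException {
        // A bounded in-memory queue standing in for a broker destination.
        BlockingQueue<String> orders = new ArrayBlockingQueue<>(16);

        // Producer: publishes messages and moves on, decoupled from consumers.
        Thread producer = new Thread(() -> {
            try {
                for (int i = 1; i <= 3; i++) {
                    orders.put("order-" + i); // blocks only if the queue is full
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        producer.start();
        producer.join();

        // Consumer: drains messages in FIFO order, whenever it gets around to it.
        while (!orders.isEmpty()) {
            System.out.println("consumed " + orders.take());
        }
    }
}
```

The value of a broker is exactly this decoupling: the producer never waits on the consumer's business logic, only on queue capacity, which is what makes the pattern useful for integration between systems that run at different speeds.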


There's so much shit to learn I don't know what to do


depends where you are. These things are important for senior software devs. You can get away with not knowing absolutely anything about this as a junior and mid level.


junior yes, mid level no. Even after 3-5 YOE they will start asking about at least some of this stuff. Luckily we can learn all this shit for free online now (or 98% of it).


THIS IS A STUDY GROUP FOR Oracle Certified Professional: Java SE 17 Developer 1Z0-829

My motivation in studying this is that my employer is soft requiring me to have a Java cert to be promoted, and this is the latest one. I bought a hardcopy of the book+practice exams and will be posting quotations from it here.

Anyone who wants to either learn modern Java (17+) or study for that exam feel free to follow along.
10 posts omitted.


Which of the following expressions compile without error? (Choose all that apply.)

A) int monday = 3 + 2.0;
B) double tuesday = 5_6L;
C) boolean wednesday = 1 > 2 ? !true;
D) short thursday = (short)Integer.MAX_VALUE;
E) long friday = 8.0L;
F) var saturday = 2_.0;
G) None of the above

B, D. Option A does not compile: the expression 3 + 2.0 is evaluated as a double, and a double requires an explicit cast to be assigned to an int. Option B compiles without issue: 5_6L is the long literal 56L, and a long value is implicitly widened to a double. Option C does not compile because the ternary operator (? :) is missing a colon (:) followed by a second expression. Option D is correct: even though the int value is larger than a short can hold, it is explicitly cast to a short, so the value wraps around to fit. Option E is incorrect: you cannot use a decimal point (.) with the long (L) suffix. Finally, option F is incorrect: an underscore cannot appear next to a decimal point.
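The two options that do compile are easy to sanity-check yourself; a quick sketch (class name made up) that shows the widened double from option B and the wrap-around from option D:

```java
public class ExamCheck {
    public static void main(String[] args) {
        // Option B: 5_6L is just the long literal 56L (underscores are
        // cosmetic); long -> double widening happens implicitly.
        double tuesday = 5_6L;
        System.out.println(tuesday); // 56.0

        // Option D: the explicit narrowing cast keeps only the low 16 bits,
        // so Integer.MAX_VALUE (0x7FFF_FFFF) becomes 0xFFFF, i.e. -1 as a short.
        short thursday = (short) Integer.MAX_VALUE;
        System.out.println(thursday); // -1
    }
}
```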

What is the result of executing the following application?

(Post truncated.)


update: been slogging through this book and boy is it boring. A ton of the questions are basically half-obfuscated Java code where you have to determine, just by looking at it, whether it compiles and what it outputs.

It's like turning yourself into a human java compiler


Those are the best, it really is the most fundamental skill.


shouldn't you familiarize yourself with a compiler if you're learning a systems programming language?


I don't think Java is usually considered a system programming language.

File: 1669310020313.gif (149.47 KB, 220x214, scared.gif)


> Amazon makes custom CPU (AWS Graviton)
> Google makes custom CPU (Google Tensor)
> Microsoft makes custom CPU (SQ3)
> Apple makes custom CPU (M1)
> many other big tech companies announce that they are planning custom chips too
Am I the only one worried about this trend?
22 posts and 1 image reply omitted.


No, I mean Amazon, Google, Microsoft, etc. all making their fully controlled stacks.


I think most of us on ARM are running a Raspberry Pi or a jailbroken Mac/Android. The fact that Microsoft and Intel are pushing hard for "trusted keys" in UEFI is far scarier, since it might mean Linux users will have to jailbreak x86 machines in the future just to run anything but Windows.


>China can't make their own chips to the standard of the west or taiwan due to sanctions/trade embargos
No, the sanctions didn't work; China is catching up in chip-manufacturing technology. They used to be 3 or 4 nodes behind and now they're just one node behind: the west is on the 5nm node and China is on 7nm. It won't be long until they reach parity. If anything, the sanctions had the result of speeding up development in China's high-tech sector.


we'll see. remember semiconductor manufacturing is very specialized; even the west or taiwan would be crippled if they couldn't get EUV machines from that one dutch firm.

i'm skeptical China can catch up that quickly. maybe in 20 years or more, though.


Why would they do this? It comes down to business concerns. Here are some plausible business needs behind each one:

> Amazon makes custom CPU (AWS Graviton)

Challenge: managing something like 20% of the world's computing power is hard and expensive at that scale.

> Google makes custom CPU (Google Tensor)

We want to do AI on-device. That's hard and slow on generic chips, and we don't need any of the GPU's video-encoding junk.

> Microsoft makes custom CPU (SQ3)

We want to sell a laptop to people who like long battery life. To get long battery life you want ARM. What if we customized it?

> Apple makes custom CPU (M1)

Customers want lots of random accelerators. Do the accelerators go on the CPU die or the GPU die?

File: 1667609095903.png (5.91 MB, 3840x2160, av1.png)


Is this gonna be the replacement for both VP9 and H.265/HEVC?
Royalty free
Better quality and more efficient compression than both, though apparently slow as hell to encode
It can be multiplexed into .mkv, .webm and .mp4
It's currently being gradually, tentatively rolled out on pretty much every major streaming platform you can think of to replace VP9

4K sample: https://www.elecard.com/storage/video/Stream1_AV1_4K_8.5mbps.webm

https://aomedia.googlesource.com/aom/ Reference implementation written in C, supported by FFmpeg
https://gitlab.com/AOMediaCodec/SVT-AV1 BSD-licensed implementation started by Intel and Netflix and targeted to be flexible for different applications, supported by FFmpeg
https://github.com/xiph/rav1e BSD-licensed lightweight encoder written in Rust and designed for speed, supported by FFmpeg
https://code.videolan.org/videolan/dav1d BSD-licensed lightweight decoder written in C, supported by Handbrake
https://chromium.googlesource.com/codecs/libgav1/ Apache-licensed decoder developed by Google
2 posts and 3 image replies omitted.



I tried encoding a 10-second clip with this on my potato laptop, it lagged like hell while making the CPU fan go crazy


File: 1668767676323.png (2.37 MB, 1920x1040, 3.png)

I like the implementation of grain synthesis in AV1 as a method of preserving film grain at low bitrates; it avoids the pitfalls of other methods such as oversharpening (psy RD in x264) and the inaccuracy of post-processing filters/noise generators. See picrel for an example screenshot taken from a file encoded at only 3 Mbps.


Encoding is definitely slow with AV1, I think it will work great once hardware encoders are common. I believe the h265 encoder is also slow (idk how it compares to av1), and h264 will still be a lot faster than both of them.


File: 1670149892986.png (17.89 KB, 826x226, ClipboardImage.png)

Google's Tensor SoC, which they use in the Pixel 6, has special hardware acceleration specifically for their own AV1 decoder
Not sure it really matters when you can't yet encode high quality reliably and fast

File: 1668114543983.png (20.54 KB, 615x425, ClipboardImage.png)


Wanting to learn Lisp. Are there any good manuals for learning it that people would recommend?





the book "Land of Lisp: Learn to Program in Lisp, One Game at a Time!" was used in my symbolic programming class in grad school. Pretty good and accessible for a beginner to LISP

File: 1651706559875.jpg (620.68 KB, 3000x2076, E14_eumWYAwu9xn.jpg)


this might be off topic but i've seen a lot of older "freedom of speech anarchists" from the 90s/00s suddenly turn out to be libertarians (if that makes any sense). basically people who back then were into freedom of information and computer security are now crypto-loving ancaps who want to own more shit than ever. there seem to be a lot of them in the /cy/ scene. anyone else seen this?
11 posts and 1 image reply omitted.


It's not the same people. Lolberts infiltrated internet communities during a period of increasing corporate centralization of the internet, trying to get on the same bandwagon, while the anarchists were stuck without time to do anything because of the increasing demands of their jobs. Lolberts usually don't have jobs.


nah insufferable libertarianism has always been part of US tech culture. (at least, the post-Altair part anyone actually cares about)
"The Californian Ideology" wasn't written for nothing.


Tech optimism died and all that was left was the greasy salesmen


File: 1669270746897.jpg (655.45 KB, 2048x1360, CyberiaScreencap.jpg)

You sure?


this is fucking nonsense

File: 1667980731149.jpg (147.78 KB, 1080x1080, linux.jpg)


Why is Linux so user-hostile? It feels like it was specifically designed by programmers so that they can feel superior to us common folks.
133 posts and 17 image replies omitted.


You sound like the normie who thinks only kids pirate.


>That's what the end user wants - a smug programmer telling them that the computer just werx for them.
not a programmer, i literally grew up on windows xp and 7 but i switched to linux because 10 was shit and 11 is supposedly even worse. i'm about as computer literate as every other zoomie
>just works
is not an expression of my smug elitism but rather an apt description for the linux user experience in 2022
>(it does not, in fact, work for them.)
it literally does. try any distro that markets itself to normies, like ubuntu, mint, manjaro etc. literally all of them will run without issue, and will usually run faster than windows on most hardware. windows 10 somehow managed to take up 4.8gb of my ram idle, while endeavour os with xfce4 takes only 1gb, although i'm pretty sure it took 800mb by default. i don't know anyone who would rather buy an expensive new laptop than stick with the one they already own, if only the os weren't an unoptimized pile of shit
here's a list of the things most people need from a computer:
>web browser
>multimedia player
>office software
linux does all of these things well
windows isn't a more practical system; it just has the monopoly and support for software that only professionals need
you can literally not mess with anything if you don't want to
(Post truncated.)


not what i'm getting at, but it's more fun to keep you out of the loop.

Look, you're going down the wrong avenue. We can sit here and litigate about whether it's better to learn scripting and fix some UI element because the guy who wrote the DE was a cunt and didn't just give you a menu option, or whether it's better to install a dodgy start menu replacement on Windows, but it doesn't count for much in the big picture.
The very fact that one has to switch to gnu/linux means it doesn't work for them. We can sit here and litigate the details of individual software choice all day, it's still not going to shift marketshares a single percentage point. Realistically, there are 3 options: You go full communist and smash Microsoft (my option), you try and make linux actually competitive in the market (arguably the Ubuntu option), or you can give up and admit that you're content with most people using Windows so long as you get to feel cool for using AmigaOS. Microsoft's dominance of the industry is a function of monopoly-capitalism, not a function of insufficient evangelism on the part of gnu/linux users. One would expect that this would come intuitively to anyone on /leftypol/, yet you'd half suspect that /tech/ was transplanted directly from /g/.

you might not mock people for their OS choices or sit feeling smugly superior to them, but others ITT clearly do. (because, as i'd never tire of pointing out, this kind of argument isn't really about the merits of computer systems, it's about ingroups and outgroups, friends and enemies, yada yada yada. not up to writing that part yet.)


>i countered that the network effect behind windows is so strong that it usually more-than-overcomes the disadvantage windows faces in being closed source when it comes to resolving practical problems
this looks dangerously close to turning an is into an ought
>If you want the end of proprietary software, what you need to do is wield the power of the state, the power of class rule, and restructure it all away
that is necessary but not sufficient. you also need agitation. I'm not seeing many CPC-funded free software projects for example, at least not in the English-speaking parts of the Internet. perhaps there are Mandarin-language projects that I have missed, but I'd at least expect more participation in the FOSS community than we see
>muh software veganism
stupidest argument I've heard so far today. vegans concern themselves with the feelings of objects


While you can say MacOS users like their UI, you can't say the same for Windows. The average Windows user was fine with Aero in 7 (if not, they put in a customized desktop); then Microsoft shoehorned Metro (which nobody likes) into the desktop and server editions of Windows because of a stupid plan to take on Android.
Linux is not harder for normies to use than Windows. The issue is normies can walk into Best Buy and pick up a computer with MacOS or Windows on it, but not Linux. Nobody is complaining about the Steam Deck running Arch, so if normies could get a pre-built Linux machine from the big manufacturers, they would put up with its oddities.
