Yeah, but running around with your hair on fire from one half-finished “urgent request” to the next is a recipe for complete failure all the same.
How you communicate where a customer’s urgent request falls on the roadmap may change, but it’s never a smart move to upend the roadmap just because the customer thinks something is urgent.
It's possible but difficult. Mainly, it requires setting expectations well, communicating what you're actually working on and why. Your field teams (sales, account management, customer success, sometimes even product marketing) are key for this. It's important to have good relationships with them such that they're highly informed partners.
The hardest part about this is that the problem does not get significantly better as you scale. There is always a biggest customer and they always matter a lot. Even AWS and Azure have to deal with this phenomenon.
Ironically, the more a contract is worth, the less thrashing you’re usually subject to, just because the project is so important to the customer.
Of course, that $10MM contract may be huge to you but tiny to your customer. By “important” I mean big enough that your customer really cares.
Ideally the service/support team should be just that and not also doing development.
You don't have firefighters who also work as bank tellers. When there's a fire, you want an immediate response. Same with servicing key customers: make them pay for the high quality of support.
> One consistent message from redditors has been that performance on the site and native apps could be better. We agree. That’s why the Reddit engineering team is working on making the Reddit platform faster and more reliable.
> A quick heads-up: this section is for engineers and robots. If you like a bit of nerdy tech talk, read on. If you don’t want to get lost in the technical details of what it takes to keep a site like Reddit running, you may want to skip ahead to the ‘Excellent’ section.
> Improving platform stability
> Last year, a major priority was improving feed load times (also known as Cold Start Latency) so that redditors could tap into their feeds and scroll through posts quickly, without waiting or watching little blue spinners tell them the page is loading. Because of those efforts, we saw drops in wait times across the board: iOS down 11%, Android down 19%, and the backend down 25%. We also made improvements that reduced crashes and errors, resulting in a 64% reduction in downtime and a 97% reduction in background error rate. We’ll continue to invest in these sorts of latency and stability improvements, while also investing in a design system to componentize Reddit’s user interface (UI).
> Making Reddit faster, faster, faster!
> Another big factor in a webpage’s performance is how much stuff it loads. The number of requests for assets, the size of those assets, and how those assets are used are all good indicators of what sort of performance the site will generally have. Reddit’s current web platforms make a lot of requests and the payload sizes are high. This can make the site unwieldy and slow for redditors (especially in places that may already have slower internet service).
> We’ve already begun work on unifying our web (what some of you call new Reddit) and mobile web clients to make them faster, clean up UX debt, and upgrade the underlying tech to a modern technology stack. (For those interested in such things, that stack is Lit element, Web Components, and Baseplate.js. And the core technology choice is server-side rendering using native web components, which allow for faster page loads.) Stay tuned, because we’ll be sharing more on these efforts later in the year, and there’s some exciting stuff on the way.
Yes, you can actually. I haven't done this myself, but there are defconfigs for MMU-less devices (the STM32F4 series, for instance). You do need some more RAM, though, which most evaluation kits don't provide. Have a look here: https://elinux.org/STM32
I'm planning to give this a spin on an EVK that I got from work if I find the time.
Linux? Yes. A popular distro? Not really. The lack of an MMU means that all processes (and shared objects) share the same address space, so unless you have some way of randomizing where things go (ASLR-style), two processes can step on each other's data (the MPU will not trigger a segfault, as both are allowed to read/write the same address, IIUC). But even with randomization, you're playing Russian roulette.
What you do get, though:
- drivers, as you said, for filesystems for instance,
- the syscalls we know and love :),
- somewhat easier deployment (ELF instead of raw bin),
- other abstractions that Linux provides,
- multiprocessing (if done carefully; see the sketch below),
- real multithreading.
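One gotcha worth knowing before relying on that multiprocessing bullet: without an MMU there is no fork() (no copy-on-write), so the usual pattern is vfork() + exec. A minimal sketch, assuming a uClinux-style target; the /bin/child_task path is just a hypothetical placeholder:

```c
/* Minimal sketch of no-MMU-friendly process spawning. fork() needs an
 * MMU for copy-on-write; vfork() does not, but the child borrows the
 * parent's address space until it execs or exits. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    pid_t pid = vfork();
    if (pid < 0) {
        perror("vfork");
        return EXIT_FAILURE;
    }
    if (pid == 0) {
        /* Child: exec (or _exit) immediately -- touching the parent's
         * data before exec is unsafe in a vfork'd child. */
        execl("/bin/child_task", "child_task", (char *)NULL);
        _exit(127);                /* exec failed */
    }
    int status;
    waitpid(pid, &status, 0);      /* parent resumes after the child execs/exits */
    printf("child exited with status %d\n", WEXITSTATUS(status));
    return 0;
}
```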
Especially when it comes to despotic life-long leaders. After 2018, Xi no longer has to worry about the pretense of term limits. We can only hope the state of Russia after the Ukrainian conflict is a convincing enough lesson.
Invading Taiwan... the risks are substantial and the gain is legacy. Hardly a convincing case for a nation, but potentially convincing for a man facing mortality.
> We can only hope the state of Russia after the Ukrainian conflict is a convincing enough lesson
We can also hope that the US refrains from creating another conflict to try to provoke China the way it has been doing in Ukraine for the last 8 years. This "Pacific NATO" is the next thing. I am very pessimistic, because the US is not going to allow China to overtake it in any way, and it has behaved like that throughout its entire history.
It makes me more confident that the right people are responsible for this project. The results for such a critical instrument are very important to the success of future research. To see them consistently exceeding mission objectives...how high can the Webb go?
10 years ago, f.lux inspired me to start down this path as well.
Starting at sunset, all the lights in our house automatically step down in color temperature and brightness every 15 minutes, from 5000K to 2700K and from 100% to 10%. SmartThings and the community-created Circadian Daylight Coordinator have made this pretty straightforward to accomplish.
I don't have data, but anecdotally I can say it makes an appreciable difference in overall comfort.
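For the curious, the ramp itself is just linear interpolation. A minimal sketch of the math, assuming a 2-hour transition window (my assumption; the actual schedule is configurable):

```c
/* Minimal sketch of a circadian dimming ramp: linear interpolation from
 * 5000K/100% down to 2700K/10% in 15-minute steps. The 2-hour window
 * is an assumption, not taken from the CDC docs. */
#include <stdio.h>

int main(void) {
    const double kelvin_start = 5000.0, kelvin_end = 2700.0;
    const double bright_start = 100.0,  bright_end  = 10.0;
    const int step_min = 15, total_min = 120;   /* assumed 2h window */

    for (int t = 0; t <= total_min; t += step_min) {
        double f = (double)t / total_min;       /* 0.0 at sunset, 1.0 at end */
        double kelvin = kelvin_start + f * (kelvin_end - kelvin_start);
        double bright = bright_start + f * (bright_end - bright_start);
        printf("t+%3d min: %4.0fK at %5.1f%%\n", t, kelvin, bright);
    }
    return 0;
}
```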
Monitors are a depressing set of products. Every consumer has a slightly different set of priorities, and for me, the continuous cycle of compromises to arrive at a monitor that checks just over 50% of my requirements feels punitive. The lack of satisfaction is compounded further by the tour de force of reasonably priced and high-end display technologies packed into a sexy af piece of art hanging in my living room. Stepping from the living room into the office feels like a trek into the past.
I'm hopeful that QD-OLED will make that a less jarring transition. Mini-LEDs may help modernize the market, but I don't see them making a significant dent in my disappointment.
If you have the time, basic electronics skills (mainly soldering), and maybe some 3D printing/basic CAD experience, you can make your own monitor, and at a decent price too.
This. 3 years ago I built a 1mm thick 4K monitor by just mounting a bare panel to a thin laptop stand and hiding the eDP to DP board underneath. It was better than anything on the market and cost literally 3/4 as much as a normal fat and ugly monitor.
I’ll just add “Message several sellers and ask for a datasheet and 1pc price+shipping from each one” to Step 3.
Yep! Just about any driver board that can convert between different input formats will also have an OSD.
I’m kinda surprised people have so many questions about this. I dug up the emails. I paid $176 incl shipping/tax to zjtechhk for a B156ZAN03.4 panel, eDP cable, and a MST9U11Q1 driver board (which has DP, HDMI, and USBC inputs, eDP output, and a ribbon cable to a PCB with 4 buttons to control the OSD)
It was an awesome deal at the time, but I’m pretty sure all these components are obsolete by now.
But does it have a terrible menu system, that takes 1 button press to change the color of gaming LEDs but 10 button presses to change the input? If not, it will never compete with commercial monitors.
It’s for factory workers that assemble laptops. Despite the label, it’s perfectly ok to touch it. There’s a flex PCB behind the label that drives the backlight and converts the eDP signal to pixels. It’s normally protected by the laptop bezel so it’s somewhat fragile. If you bend it, you’ll break traces on the PCB or pop components off.
You are! My first laptop ~15 years ago had a super thin (1mm?) fluorescent tube below the panel where the flex PCB would be. It used thousands of volts to start the backlight.
My panel (and anything from the last decade) uses LED backlights so I would not expect anything above 60V.
The problem is precisely the selection of appropriately sized "TV quality and price" panels to build a monitor from, not the physical construction of such panels into monitors.
I'm not sure I understand you. Are you saying that the problem is selecting the right model out of all the available choices, or are you saying that there aren't models available at TV quality/prices?
Regarding the former, the website does have quite a few filters (the navigation isn't the best, but it works).
For the latter, yes, quality can be quite hit-and-miss when purchasing, but prices are very reasonable. There may be a ~10-15% markup compared to a mass-produced, razor-thin-margin monitor being sold at nearly a loss on Amazon, but most products are quite reasonably priced.
Re: the quality, if you want advanced features you may be able to contact the distributors/manufacturers selling on the website and ask them to make a custom order with more features (e.g. an extra port) for a slightly higher price, though this varies.
The latter. As you say, finding panels that exist is easy even for a layman, so it follows that, out of over 100 million monitors shipped per year, at least one manufacturer would be using said all-around vastly superior panel and you wouldn't have to DIY it.
I've used this method for repurposing laptop screens, but I've never found it a significant cost saver (even ignoring time/labor) vs standard prebuilt monitors, nor is anyone making such a panel exclusively for DIY-ers.
It's certainly true that for high-end displays I'd just go to Dell/LG/Samsung etc. instead of AliExpress. Heck, anything premium enough to cover beyond sRGB is likely to be rare on Ali. But (imo) the low prices and easy access to the low-to-mid end of the market are the real benefit.
Not saying that you can't get those 8K panels LG uses, but sending $4k to someone over AliExpress feels quite risky even to me. You could probably get good stuff if you put in the effort, but if you're earning decent money from your CS job (I suppose most here are; I'm still a student) then it'll probably be easier to buy OEM/brand name.
They're usually called scaler boards, not drivers. Ones with LVDS outputs are common, but a lot of the new panels use VbyOne which seems to be rarer and more expensive to find a scaler for.
I've been going crazy and frustrated with the lack of HiDPI monitors on the market which can do integer scaling (27 inch 5K, 32 inch 8K) and here's a website that lists panels for such monitors.
I have no idea how to go about making a custom monitor but if it's feasible to solve my problem, I'm willing to learn.
So let's say I want to buy the LM270QQ1-SDA2, should I order that panel from an e-commerce website? Or should I find a monitor with that specific panel? Why is 3d printing needed here? Any article or detailed guide you can point me to?
I’d recommend buying the panel, eDP cable, and driver board from the same seller. I posted my experience doing the same above. Of course you could take a chance with random components and it’ll probably work as long as the connectors match, but why bother when Chinese vendors are willing to do the legwork for you?
You don’t need a 3D printer if you have another way to hold the panel (I glued it to a thin laptop stand)
Yes exactly. You need something to attach the panel to (unless you've got magic to float it in mid air lol). You can repurpose an old canvas/acrylic board, or go fancy with a custom 3d printed case.
Thing is, you could glue the panel to something, as people have done. I find that detestable because you can't remove the panel non-destructively. A proper enclosure holds the panel instead of sticking to it.
What are your priorities? A lot of pro photographers I know use iMacs because they are a cheap way to get an adequate computer attached to a monitor with high resolution and OK (for the price) color space
This is currently challenging because the 5K iMac is still on Intel which I’d be hesitant to buy into at this point. Hopefully resolved soon but we don’t know exactly when or what form it will take (e.g. will the 5K iMac be an “iMac Pro” and have a higher price than the Intel version?).
System agnostic (not an all-in-one), <35", HDR, OLED or OLED-backlight, >144Hz, 1440p or higher, G-Sync, 16:9, >=99% DCI-P3, doesn't look like an F-117 fucked a Ferris wheel at an EDM festival, <$2k.
I'd be curious as to the differing priorities you are thinking about.
For me, I want real estate. At this point I'm looking forward to 8k in a 50-55" TV, good DPI, and 60Hz. Not a twitch gamer, I'm a developer. I use TVs.
Gamers want response time/Hz, decent appearance. They are the prime target of "Monitors"
Then there are the professional editors and the like. They used to need high-end monitors, but I think high-end 8K TVs will serve them as well.
What I am looking for is a large (at least 37.5") ultrawide OLED (or something with comparable contrast) with at least 3840x1600 pixels and a >= 120 Hz refresh rate. Basically something that is a straight upgrade from my current monitor [0] that improves the contrast without compromising on other factors (size and resolution are hard requirements, refresh rate, pixel response and color gamut are negotiable as long as they are good enough, brightness I don't care - have the current one set to 10%). Oh, it also needs to support FreeSync, but that seems to be less of a problem these days.
There are no panels that fulfil that at the moment. The nearest option would be getting a 4K OLED and then just not using part of the panel, but that is hardly ideal.
Sure you’re not a gamer, but have you tried using a 120hz/240hz monitor for a week? I think you’ll find that it’s almost as massive an upgrade as going from 1080p to 4K.
Going back to 60hz is painful. You’ll see the cursor moving in a rotating square pattern when you’re moving your mouse in a circle. The lag is palpable.
I've been using a 144Hz screen for two years, I am a gamer, and I'll be honest: I don't notice much difference between consistent 60Hz and 144Hz. And I'm fairly sensitive to frame rate - when I'm watching a movie at someone else's house, I can tell in seconds they have the garbage "smoothing" or whatever features enabled.
Things that do stand out to me are input lag (again, usually only from TV-as-monitor with bad settings), bad colour space, and occasionally bad grey-to-grey time. I would take a 60Hz monitor over a 144Hz one if it meant avoiding any one of these issues. In a heartbeat.
One of the jarring things about using a TV as a monitor is that some TV models don't let you disable the image processing that blurs text, so you end up with blurry text that gives you headaches. I remember encountering this problem with a specific Samsung TV model where the processing was disabled only for the VGA port, not the rest of the ports (it was hard-coded and couldn't be turned off).
Not the GP, but my current dream monitor would be a 3:2 or 16:10 OLED in the 24"-27" range with roughly 200 PPI and 120 Hz, preferably slightly curved, with hardware calibration for at least sRGB gamut. There’s nothing close to that in the market.
I just got an RGB OLED laptop with a gamut significantly wider than Display P3. It's just glorious. UHD content like 4K movies just pop in a way that you have to see in person. It's especially noticeable on military uniforms, where the various shades of dark green are much more distinct than on a typical monitor.
My priority is color accuracy, via hardware calibration (LUT) (no loss of gradations by OS-level or GPU-level mappings). I’d rather have an accurate sRGB display than a not-quite-accurate P3 (or, worse, "natural" wide gamut) display. Also, to display sRGB images (still the large majority of what’s out there) accurately on a wide-gamut system, you need 10-bit color depth at the OS/GPU level to not lose/distort gradations.
It's not sufficient for the display to be 10-bit: the OS and/or GPU (where any software calibration mapping takes place) must also work in 10 bits, and when graphics from different color spaces are combined on screen (UI graphics, displayed images, etc.), the OS must correctly map each source color space to the 10-bit output color space. All of that working correctly is not commonplace yet.
Therefore, for dev work and dev-related UI graphics, I prefer to work in a calibrated "least common denominator" 8-bit sRGB space, because that's much easier to get right. However, in order to not lose color gradations to calibration, hardware calibration is then preferable.
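To make the "losing gradations" point concrete, here's a minimal sketch (not any particular calibration tool's method): push all 256 8-bit levels through a correction curve, round back to 8 bits, and count how many distinct output levels survive. The gamma tweak here stands in for a real calibration LUT:

```c
/* Minimal sketch of why software calibration at 8 bits costs gradations:
 * mapping 256 input levels through a correction curve and rounding back
 * to 8 bits collapses some levels together. Hardware calibration applies
 * the curve inside the monitor at higher precision, avoiding this. */
#include <math.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned char seen[256];
    memset(seen, 0, sizeof seen);
    int distinct = 0;

    for (int in = 0; in < 256; in++) {
        /* example correction: gamma 2.2 -> 2.4, rounded back to 8 bits */
        double v = pow(in / 255.0, 2.2 / 2.4);
        int out = (int)(v * 255.0 + 0.5);
        if (!seen[out]) { seen[out] = 1; distinct++; }
    }
    printf("distinct output levels: %d of 256\n", distinct);
    return 0;
}
```

Because the curve compresses part of the range, several inputs collapse onto the same output value, which is exactly the banding you avoid by calibrating in the monitor's hardware.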
Windows since Vista can use 16-bit float buffers for the desktop manager. Some applications support this too for all controls and UI elements. Desktop graphics applications support 10-bit, such as Photoshop. Similarly, video playback is generally 10-bit.
In the past, this feature was reserved for the ludicrously expensive "professional" GPUs like the Quadro series, but it has been enabled in software for the mainstream AMD and NVIDIA GPUs. Very recently (just months ago?) my Intel GPU gained 10-bit output capability even in SDR mode.
It definitely works, I used "test pattern" videos and test images in Photoshop, and even dark grey-to-grey gradients are silky smooth on two of my monitors. This includes a 7-year-old Dell monitor!
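If anyone wants to reproduce that check without Photoshop, here's a minimal sketch that writes a dark grey-to-grey ramp as a 16-bit PGM (the filename and grey range are arbitrary choices of mine):

```c
/* Minimal sketch of a banding test image: a dark grey-to-grey horizontal
 * ramp written as a 16-bit PGM. On a true 10-bit pipeline the ramp should
 * look smooth; on an 8-bit one you'll see visible steps. */
#include <stdio.h>

int main(void) {
    const int w = 1024, h = 256;
    const int lo = 2048, hi = 6144;    /* dark greys in the 16-bit range */
    FILE *f = fopen("ramp.pgm", "wb");
    if (!f) { perror("fopen"); return 1; }

    fprintf(f, "P5\n%d %d\n65535\n", w, h);   /* binary PGM, 16-bit maxval */
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int v = lo + (hi - lo) * x / (w - 1);
            /* PGM stores 16-bit samples big-endian */
            fputc((v >> 8) & 0xff, f);
            fputc(v & 0xff, f);
        }
    }
    fclose(f);
    return 0;
}
```

Open it full-screen in a viewer that passes 16-bit data through: smooth on a 10-bit pipeline, visible steps on an 8-bit one.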
What are your requirements? I had no issue finding one that met all of mine at a reasonable price a few years ago. 144Hz, low latency, decent size, 2k resolution.
I've considered it, but I can't play FPS effectively on anything over 34" due to biology and the $ per sq ft of real estate. The seating position needed to keep the game in focus would be halfway across the room.
OLEDs under 40" aren't TVs and almost all OLED monitors target revenue generating use-cases and are priced to match.
LG sells several 48" OLED TVs. To get the equivalent of 30" distance to a 27" monitor (about 48 degrees FOV), you need to be about 4.5 feet away from a 48" screen.
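Those numbers check out if you match the angle subtended by the screen diagonal (one common convention). A minimal sketch of the math for plugging in your own sizes:

```c
/* Minimal sketch of the viewing-distance math: find the distance at which
 * a 48" diagonal subtends the same angle as a 27" diagonal at 30". */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double ref_diag = 27.0, ref_dist = 30.0;   /* inches */
    const double new_diag = 48.0;

    /* angular size of the reference setup, using the diagonal */
    double fov = 2.0 * atan((ref_diag / 2.0) / ref_dist);
    /* distance at which the new screen subtends the same angle */
    double dist = (new_diag / 2.0) / tan(fov / 2.0);

    printf("FOV: %.1f degrees\n", fov * 180.0 / M_PI);         /* ~48.5 */
    printf("Matching distance: %.1f in (%.1f ft)\n",
           dist, dist / 12.0);                                  /* ~4.4 ft */
    return 0;
}
```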
I just mounted mine to the wall, and the back of the desk is about a foot off the wall.
I have the Gigabyte FO48U, which uses the same panel as the 48" LG C1, and... it's a beautiful TV. That's what I wound up using it for, because I couldn't stand it as a monitor and went back to my old 27". Image retention is still a real factor for computer use, and maximizing a mostly-white window results in this incredibly jarring transition from a screen that's almost uncomfortably bright to one that's too dim.
"The Consumer Financial Protection Bureau (CFPB) is an agency of the United States government responsible for consumer protection in the financial sector. CFPB's jurisdiction includes banks, credit unions, securities firms, payday lenders, mortgage-servicing operations, foreclosure relief services, debt collectors, and other financial companies operating in the United States. "
It's definitely fraud. The only reason to hide the things they do is to mislead the customer as evidenced by previous cases of this that caused serious harm to consumers.
What do you expect? These companies are making toys for retail consumers. If you want devices that guarantee data integrity for life-or-death or commercial applications, those exist, come with lengthy contracts, and cost 100-1000x more than consumer-grade stuff. I seriously have a hard time empathizing with someone who thinks they're entitled to anything more than a basic RMA if their $60 SSD loses data.
There's a big difference in this depending on why the SSD lost the data.
If the drive fraudulently declared it had no write-back cache while demonstrably not being crash-consistent, and that came from more than innocent negligence in firmware development, that's far different from some genuine bug in the FTL messing up the data mapping.
Personally, I expect specifications to be implemented properly. That's it.
About "commercial applications", let's face it. Those "enterprise solutions" cost way higher not because they are 10-1000x times "better", but because they contain generous "bonuses" for senior stuff.
In B2B, telling your largest customer that their problem with the service isn't urgent, and then attempting to educate them, is going to be a mess.