But that’s more key presses than just using existing keys
Don’t really get your point here.
They virtualize the file because it’s big. They know the size.
It does indeed scale with the size of the file. That’s exactly the problem.
I wish they’d charge a little. Supporting them feels so good and I’ve run out of children to buy the game for haha
Since we aren’t using those, it is in fact a climate disaster currently, and they know that.
Assuming you want to ground your argument in our reality.
No, it relies on the C# project files. It looks for all ProjectReference tags in the project file, recursively grabs all of them, and turns them into filters.
You have a list of filters like “src/libs/whatever/*”; if a change matches one of them, the pipeline runs.
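As a rough sketch, a path-filtered trigger in an Azure Pipelines YAML file looks like this (the branch and folder names here are illustrative, not from the actual setup):

```yaml
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - src/libs/whatever/*
      - src/services/myservice/*  # hypothetical service folder
```

Only commits touching files under the included paths kick off the pipeline, which is what makes per-service builds in a monorepo cheap.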
I wrote a tool that automatically updates these based on recursive project references (C#).
So if any project referenced by the service (or recursively referenced by its dependencies) changes, the service is rebuilt.
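A minimal sketch of that recursive walk, in Python for illustration (the tool itself isn’t shown in the comments; this assumes SDK-style .csproj files without an XML namespace, and the function names are made up):

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def collect_references(csproj_path, seen=None):
    """Recursively collect a .csproj file and everything it references."""
    seen = seen if seen is not None else set()
    path = Path(csproj_path).resolve()
    if path in seen:  # guard against reference cycles
        return seen
    seen.add(path)
    root = ET.parse(path).getroot()
    # ProjectReference items sit under ItemGroup in SDK-style projects
    for ref in root.iter("ProjectReference"):
        include = ref.get("Include")
        if include:
            # Include paths are relative to the referencing project,
            # often written with Windows-style backslashes
            dep = (path.parent / include.replace("\\", "/")).resolve()
            collect_references(dep, seen)
    return seen

def to_filters(csproj_path):
    """Turn each referenced project's directory into a path filter."""
    return sorted(f"{p.parent.as_posix()}/*"
                  for p in collect_references(csproj_path))
```

The `seen` set doubles as the result and the cycle guard, so diamond-shaped dependency graphs are only visited once per project.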
If statements pretty much get compiled to a goto. Well, more a jump-if, but same principle.
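A quick way to see this for yourself, using Python bytecode as a stand-in for compiled output (the principle is the same in most compiled languages): disassembling a function with an `if` shows a conditional jump instruction, not any dedicated “if” construct.

```python
import dis

def check(x):
    if x > 0:
        return "positive"
    return "non-positive"

# The if becomes a conditional jump ("jump-if"): the condition is
# evaluated, then a POP_JUMP_IF_FALSE-style opcode skips the body
# when the condition is false.
for ins in dis.Bytecode(check):
    print(ins.opname)
```

The exact opcode name varies between Python versions, but there is always a conditional jump where the `if` was.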
A certain world event being a 3rd party piece of software having a bad update.
We use a monorepo for a new cloud-based solution. So far it’s been really great.
The shared projects are all in one place so we don’t have to kick things out to a package manager just to pull them back in.
We use path filters in Azure Pipelines so things only get built if they or their dependencies get changed.
It makes big changes that span multiple projects effortless to implement.
Also running a local deployment is as easy as hitting run in the ide.
So far no problems at all.
Not sure what you mean? The hardware runs the software tasks more efficiently.
It’s hardware specifically designed for running AI tasks. Like neural networks.
An NPU, or Neural Processing Unit, is a dedicated processor or processing unit on a larger SoC designed specifically for accelerating neural network operations and AI tasks. Unlike general-purpose CPUs and GPUs, NPUs are optimized for data-driven parallel computing, making them highly efficient at processing massive multimedia data such as videos and images, and at processing data for neural networks.
But you can run more complex networks faster. Which is what I want.
An NPU, or Neural Processing Unit, is a dedicated processor or processing unit on a larger SoC designed specifically for accelerating neural network operations and AI tasks.
Exactly what we are talking about.
I’m a programmer, so when learning a new framework or library I use it as interactive docs that allow follow-up questions.
I also use it to generate things like regex and SQL queries.
It’s also really good at refactoring code and other repetitive tasks like that
I use it heavily at work nowadays. It would be nice to run it locally.
So how do you plan to pay sites for the resources you use?
Tabs would make so much more sense. 1 character per indent.
Might be easier to use a capture card or similar to just record your pc.
I have type 1 diabetes and on bad days really struggle with brain fog.
Make sure you have all your tasks written down and organised, ideally in some form of project management software.
Things get easier the more programming you do. Once you’ve called a method for the 10,000th time you won’t need to think about it.
I find ChatGPT is really good for reminding you of syntax and things like that. No matter how many times I do it, I can’t remember the ordering in a C# switch expression, so I just ask for the syntax.
You’ve linked to the save page and it’s failing. The link works if you remove /save/ from it