bachmeier an hour ago

I was initially interested in Positron, until checking out the license:

"Positron is licensed under the Elastic License 2.0, a source-available license. This license makes Positron available for free to everyone to use, build on, and extend for personal, academic, and commercial use. Its primary restriction is that you can’t host Positron as a service to third parties without Posit’s agreement. This restriction is necessary for us to build a sustainable business around Positron while also offering it free of charge to the community."

"You may not move, change, disable, or circumvent the license key functionality in the software, and you may not remove or obscure any functionality in the software that is protected by the license key."

This is not a recommendation on whether anyone should use Positron. I just think it's fair for people to know it's another piece of proprietary software with a license key. Posit is a public benefit corporation, which sounds nice, but you're still subject to the same games any other for-profit private company plays with its customers.

(I use Posit Cloud in my teaching, so my interactions with their products are as a customer, but I use RStudio, which is open source.)

juujian 17 minutes ago

Was looking forward to trying this, but they haven't gotten inline chunks for R Markdown to work yet, a feature I use religiously. Hopefully they'll be able to ship it eventually, but based on the GitHub issue about it, it will be a while.

affenape 6 hours ago

> A %PRODUCTNAME% next-generation editor/IDE is released

> Look inside

> VS Code

  • muragekibicho 3 hours ago

    VS Code is the chromium of IDEs. I shall not explain further.

    • hirako2000 an hour ago

      And when you look inside VS Code, it is Chromium.

k310 8 hours ago

Formerly RStudio

> RStudio (now Posit) was founded in 2009 with the vision of creating high quality open-source software for data scientists. We’ve grown exponentially over time but our culture remains unchanged. We invest heavily in open-source development, education, and the community with the goal of serving knowledge creators 100 years from now.

> We want Posit to serve a meaningful public purpose and we run the company for the benefit of our customers, employees, and the community at large. That’s why we’re designated as a Public Benefit Corporation. As a Certified B Corp, we must meet the highest verified standards of social and environmental performance, transparency, and accountability. Our directors and officers have a fiduciary responsibility to address social, economic, and environmental needs while still overseeing our business goals.

  • anewhnaccount2 7 hours ago

    So the main news is that they're giving up on developing an independent IDE and turning into another VS Code fork. The loss of biodiversity and the reliance on a not-so-reliable steward is mildly concerning.

    • cwnyth 7 hours ago

      They (reps? devs? I don't remember) recently mentioned that they won't give up on RStudio and that it will stay separate from Positron. I really hope that stays true.

    • uniqueuid 5 hours ago

      OTOH, Posit funds a lot of development of important packages in the tidyverse and does a lot of community work, etc.

      So if maintaining RStudio is so much of a burden that it impedes the rest of their work, I don't think it's a bad idea to reduce the amount of work spent trying to compete with VSCode when that's an increasingly tough sell.

      I'm not a fan of VSCode personally, but would probably be happy with a tmux setup with a console for R and some minimal output viewer, so people like me should be able to cobble something together that's a workable alternative to Posit.

  • benrutter 6 hours ago

    I'm not sure. Wording like this:

    > We anticipate many RStudio users will be curious about Positron.

    heavily implies it's a separate thing that will continue to be maintained. They haven't said they're getting rid of support for RStudio.

    I think this is probably more that Posit have been trying to move more and more into the Python space, since that's where most data science is happening. RStudio is great but is obviously very associated with R, so making a similarly intended project that is more explicit in supporting other languages isn't inherently a bad shout.

    • philipallstar 5 hours ago

      It will continue to be maintained, but if lots of R people move to Positron then RStudio's features will start to lag, and they'll eventually deprecate it.

    • shellfishgene 5 hours ago

      Unless this has recently changed, the support for LLM coding tools in RStudio is so bare bones that I would expect many users to switch to Positron just for that.

  • specproc 5 hours ago

    I sent this to an R friend, and he was like, "yeah, it's been changed for a few years now". Is he missing something or has there been a major version or something?

    • almostkindatech 4 hours ago

      He may be mixing up the company change with the IDE: Posit, the company, was renamed a few years ago, whereas Positron, the IDE, is new.

      • ellisv 43 minutes ago

        The IDE has been available for a while.

  • fithisux 3 hours ago

    Not true.

    But the community can maintain it.

    The RStudio users can give a roadmap and ask for help.

ktrask 3 hours ago

I hope Positron works out fine; last time I checked it was not yet usable.

Replacing RStudio with something more reliable would be nice, because RStudio has some major design flaws. A lot of the UI also runs in R, so when the R kernel dies, quite often I cannot save unsaved files and have to copy the file contents to a different text editor. I also don't understand why the LLM chat window runs inside the R console, where it blocks running R code. That makes it completely unusable.

akst 5 hours ago

I know "next-generation" is just SEO slop, but I'm going to hyper fixate on this for a moment (so feel free to ignore if you're actually interested in Positron).

I think the future of data science will likely be something else, given the advent of WebGPU[1] (which isn't just a web technology), the current quality and availability of GPUs in end-user devices, and how much data computation clearly stands to benefit.

The real next generation of data science tools will likely be GPU-first, trying to keep as much work on the GPU as possible. I definitely think we'll eventually see new languages emerge that abstract away much of the overhead of batching work, but also force people to explicitly consider when they're writing code that simply won't run well on a GPU, like sequential operations that are nonlinear or nonassociative/noncommutative (such as processing an ordered block of text).
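
A minimal sketch of that distinction (plain TypeScript, purely illustrative): an associative reduction can be split across GPU threads and recombined in any order, while a nonassociative fold has exactly one valid evaluation order and stays serial.

    // Associative and commutative: chunks can be summed in parallel and the
    // partial results combined in any order, since (a + b) + c === a + (b + c).
    const data = [1, 2, 3, 4, 5, 6, 7, 8];
    const total = data.reduce((acc, x) => acc + x, 0);     // GPU-friendly

    // Nonassociative: (a - b) - c !== a - (b - c), so this fold has exactly
    // one valid evaluation order and forms a sequential dependency chain.
    const leftFold = data.reduce((acc, x) => acc - x, 0);  // inherently serial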

I think WebGPU is going to make this a lot easier.

That said, I'd imagine that for larger compute workloads people are going to stick with large CUDA clusters, as those have more functionality and handle a larger variety of workloads. But on end-user devices there's an opportunity to create tools that let data scientists do this kind of work more trivially when they compute their models and process their datasets.

[1] Other compute APIs existed in the past, but WebGPU might be one of the most successful attempts to provide a portable (and more accessible) way to write general GPU compute code. I've seen people say WebGPU is hard, but having given it a go (without libraries), I don't think that's all that true: compared to OpenGL, there are no longer specialised APIs for loading data into uniforms; everything is just a buffer. I wonder if the reputation has more to do with non-JS bindings for use outside the browser/Node, or with the fact that you're forced to consider the memory layout of anything you're loading into the GPU from the start (something that can be abstracted and generalised). In my experience, after a first attempt at writing a compute shader, it's fairly simple. Stuff that was always complicated in rendering, like text, is still complicated, but at least it's not a state-based API like WebGL/OpenGL.
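
For anyone curious what that buffer-centric model looks like, here's a minimal sketch of a compute pass that doubles a Float32Array (browser TypeScript; assumes WebGPU support and types along the lines of @webgpu/types, and the function name doubleOnGpu is mine). Input, shader storage, and readback are all ordinary buffers:

    // Minimal WebGPU compute sketch (illustrative): double every element.
    // Note there are no uniform-specific upload APIs; data travels in buffers.
    const shader = `
      @group(0) @binding(0) var<storage, read_write> data: array<f32>;

      @compute @workgroup_size(64)
      fn main(@builtin(global_invocation_id) id: vec3<u32>) {
        if (id.x < arrayLength(&data)) {
          data[id.x] = data[id.x] * 2.0;
        }
      }`;

    async function doubleOnGpu(input: Float32Array): Promise<Float32Array> {
      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) throw new Error("WebGPU not available");
      const device = await adapter.requestDevice();

      // One storage buffer holds the data the shader reads and writes.
      const storage = device.createBuffer({
        size: input.byteLength,
        usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
        mappedAtCreation: true,
      });
      new Float32Array(storage.getMappedRange()).set(input);
      storage.unmap();

      // A second buffer is needed to map results back to the CPU.
      const readback = device.createBuffer({
        size: input.byteLength,
        usage: GPUBufferUsage.MAP_READ | GPUBufferUsage.COPY_DST,
      });

      const pipeline = device.createComputePipeline({
        layout: "auto",
        compute: {
          module: device.createShaderModule({ code: shader }),
          entryPoint: "main",
        },
      });
      const bindGroup = device.createBindGroup({
        layout: pipeline.getBindGroupLayout(0),
        entries: [{ binding: 0, resource: { buffer: storage } }],
      });

      // Record the compute pass and the copy to the readback buffer.
      const encoder = device.createCommandEncoder();
      const pass = encoder.beginComputePass();
      pass.setPipeline(pipeline);
      pass.setBindGroup(0, bindGroup);
      pass.dispatchWorkgroups(Math.ceil(input.length / 64));
      pass.end();
      encoder.copyBufferToBuffer(storage, 0, readback, 0, input.byteLength);
      device.queue.submit([encoder.finish()]);

      await readback.mapAsync(GPUMapMode.READ);
      const result = new Float32Array(readback.getMappedRange().slice(0));
      readback.unmap();
      return result;
    }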

  • ssivark 2 hours ago

    Interesting question. I don't know much about WebGPU, but I'd posit (heh!) that the GPU on client devices doesn't matter too much, since folks will likely be working over the network anyway (cloud-based IDE, coding agent connected to a cloud-hosted LLM, etc.), and we also have innovations like Modal which allow serverless lambdas for GPUs.

    As long as silicon is scarce it would make sense to hoard it and rent it out (pricing as a means of managing scarcity); if we end up in a scenario where silicon is plentiful, then everyone would have powerful local GPUs, using local AI models, etc.

    • akst 40 minutes ago

      I guess in my mind I was thinking of use cases other than AI, like statistical or hierarchical scientific models, simulations, or ETL work. I also don't know if some of the econometricians I know with a less technical background would even know how to get set up with AWS. More broadly, I feel there are enough folks doing data work in non-tech fields who know how to use Python or R or Matlab for their modelling but likely aren't comfortable with cloud infrastructure, yet might have an Apple laptop with Apple silicon that could improve their iteration loop. Folks in AI are probably more comfortable with a cloud solution.

      There are aspects of data science that are iterative, where you're repeatedly running similar computations with different inputs; I think there's some value in shaving off time between iterations.

      In my case I have a temporal geospatial dataset with 20+ million properties for each month over several years, each with various attributes. It's in a nonprofessional setting, and the main motivator for most of my decisions is "because I can, I think it would be fun, and I have a decent enough GPU". While I could probably chuck it on a cluster, I'd like to avoid that if I can help it, and an optimisation done on my local machine would still pay off if I did end up setting up a cluster. There's quite a bit of ETL preprocessing work before I load it into the database, and I think there are portions that might be doable on the GPU. But it's mostly the computations I'd like to do on the dataset before generating visualisations where I think I could reduce the iteration wait time, ideally to the point that iterations become interactive. There are enough linear operations that you could get some wins with a GPU implementation.

      I am keen to see how far I'll get, but worst case I learn a lot, and I'm sure those learnings will be transferable to other GPU experiments.

  • hatmatrix 4 hours ago

    It's worth considering what next-gen really would be, but VSCode and its forks will probably dominate for the time being. I recall Steve Yegge predicting, around 2008 or so, that the next IDE to beat would be the web browser. That's not quite the reality, but it took about 10-15 years for something like it to actually happen, even though there were earlier shots at it, like Atom.

  • hhh 5 hours ago

    check out the RAPIDS ecosystem from 2018 or so :)

    • akst 40 minutes ago

      This looks interesting, thanks for sharing.

hwj 3 hours ago

Reading the title, I expected this to be the successor of Electron.

Or at least a positive version of it...

  • Yizahi an hour ago

    Each Positron installation annihilates one Chrome clone from the PC and frees up 1-2 Gigajoules of RAM in the process.

SuperNinKenDo 6 hours ago

Oof. That's a damn shame. I think languages and use cases like this are the perfect place for purpose-built IDE development. If even these guys are turning into a VS Code downstream, that's just sad.

Coincidentally I was thinking of giving R another go, but honestly now... I'm good...

  • postexitus 5 hours ago

    If (and that's a big if) they can give the RStudio experience in a VSCode environment, benefiting from the plugin ecosystem etc., why not?

    RStudio is great when you are doing your own thing, but when it comes to more generic tools like Git, LLMs, autoformatting, etc., it's a hard pass.

    • SuperNinKenDo 3 hours ago

      The "why not" is that they'll constantly be wrestling with, and limited by, the generalist nature of a tool like VS Code. While generalism has its benefits and its drawbacks, what it most definitely does not give you, by its very definition, is a fully bespoke experience, and I'm simply bemoaning that specific loss, because I like to see specialised tools sometimes.

  • philipallstar 5 hours ago

    Who cares if it's VSCodium-based?

    • irilesscent 4 hours ago

      I feel like a lot of these could be packaged as extensions for VSCode. I'd rather not have multiple different variations of the same IDE; too much duplication.

    • IshKebab an hour ago

      Because generally they don't add anything that couldn't have just been a VSCode extension... in which case it would really be better for users if it was a VSCode extension. The only reason they don't do that is for branding & control purposes.

      There are exceptions. E.g. Theia actually does enough stuff differently that I think it warrants being its own thing. At least it did. Looks like they have jumped on the AI bandwagon too.

      Maybe this is the same; I haven't looked at it in detail. But "we have an IDE! (don't tell them it's VSCode)" feels a lot like "we have an app! (don't tell them it's a webview)".

    • SuperNinKenDo 3 hours ago

      I like seeing specialised, bespoke tools existing, that's all.

cons0le 6 hours ago

"Next generation IDE" comes out like every fuckin week. I already tried 3 new ones this month so I'm done. It looks nice tho

BrenBarn 7 hours ago

Wow, so RStudio has switched from being a "real" desktop app to another webview-based thing? Bummer. I hadn't used RStudio for some time, but now I'll probably continue not to...

  • postexitus 5 hours ago

    It was actually the case for a very long time, albeit a very successful one, so you never realized.

  • uniqueuid 5 hours ago

    It has always been that way (or at least for decades).