To Excel is human; to Transform is divine. I am a fan of Power Query. I agree with the call for genomic accountability.
To complain about Excel itself is just lame.
It's a poor artisan who blames their tools. Take responsibility.
What has really annoyed me about the quality crisis in academic publishing is that the editors at academic journals take essentially no responsibility. And at least for data quality, there are some fairly simple checks editorial staff can run.
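To give a flavor of what I mean, here is a rough sketch in Python of the kind of screening a staffer could run on a submitted dataset. The thresholds and the sample column names are my own invention for illustration, not anything a particular journal actually uses:

```python
import csv

def basic_quality_checks(rows):
    """rows: a list of dicts, e.g. from csv.DictReader.
    Returns a list of red flags worth a human's attention."""
    issues = []
    # Exact duplicate rows often indicate copy/paste or merge errors.
    seen = set()
    dupes = 0
    for r in rows:
        key = tuple(sorted(r.items()))
        if key in seen:
            dupes += 1
        seen.add(key)
    if dupes:
        issues.append(f"{dupes} duplicate row(s)")
    # Columns with a high fraction of blank cells suggest a botched
    # export or manual paste. The 20% cutoff is arbitrary.
    if rows:
        n = len(rows)
        for col in rows[0]:
            blanks = sum(1 for r in rows if not r[col].strip())
            if blanks / n > 0.2:
                issues.append(f"column '{col}' is {blanks / n:.0%} blank")
    return issues

# Usage on a submitted file:
# with open("submission.csv", newline="") as f:
#     print(basic_quality_checks(list(csv.DictReader(f))))
```

Nothing sophisticated, and it would still catch a lot of what ends up on Retraction Watch.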
I tend to use Power Query to import data from SQL Server, which avoids some of the issues with merely copying and pasting data into Excel. I can appreciate the pain over the Microsoft years.
Copy/paste is too manual, which invites all sorts of problems into the process.
But Power Query has issues as well.
Part of the problem is that people don't want to actively think about their systems.
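Power Query's M is the natural tool for this in Excel, but the underlying principle, query the source directly rather than pasting cells by hand, is the same in any language. A minimal sketch in Python, using the standard library's sqlite3 as a stand-in for SQL Server (the table and column names are invented):

```python
import sqlite3

def fetch_table(db_path: str, query: str) -> list[dict]:
    """Pull rows straight from the database, so no step of the
    pipeline involves hand-copying cells into a spreadsheet."""
    conn = sqlite3.connect(db_path)
    try:
        # Row objects let us return column-name -> value dicts.
        conn.row_factory = sqlite3.Row
        return [dict(r) for r in conn.execute(query)]
    finally:
        conn.close()

# Usage (assuming a 'sales' table exists in report.db):
# rows = fetch_table("report.db", "SELECT region, total FROM sales")
```

The point is that the query is recorded and repeatable, which is exactly what a manual copy/paste step is not.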
I agree; the M language takes some thought. I like the fact that you can create M functions without VBA 😁.
One of my big issues is that I usually have to build things that other people will use... people who don't have my level of Excel knowledge (and have zero M, R, Python, etc. knowledge... and minimal VBA).
I used to write some complicated solutions, thinking I was being clever, but later realized I was making a fragile system. So then I built spreadsheets that were "readable" to lots of people and easy to update... and supposedly some spreadsheets I made in 2005 are still living on [and continuing to be updated]. Maybe I shouldn't have encouraged that, though. Hmmm.
Balancing complexity with user-friendliness is a common challenge when people with varying expertise have to use your work, and your move from clever-but-fragile to readable-and-maintainable mirrors many professionals' experience. The longevity of your 2005 spreadsheets speaks in their favor, even if it means they still need updating. That kind of accessibility also fits the positive disruption AI can bring: simplifying complex tasks and widening access to advanced tools.
I have already used AI for a few things (I've mentioned before: I hate coming up with headlines for certain things, so I've used it for that purpose), and I'm really looking forward to deploying it for more uses.
It's just that the general-use tools tend to be too sloppy for what I need, even the ones that supposedly help you with Python code. I'd rather search Stack Exchange and read through the solutions there. People are getting outright wrong answers from the LLMs, and it's very annoying.
I've been amassing more examples of "inappropriate use of LLMs in serious applications," which I will likely highlight in future posts. Retraction Watch is one of my favorite blogs to read; they've been picking up examples over there.
In my view, LLMs hit the mark maybe 70% of the time. They are constantly improving through our feedback, but we should be cautious about trusting them completely. I will take a look at that blog soon.