There is nothing that an AI Researcher does that an AI Engineer cannot do. In fact, the following make up the bulk of a researcher's job function.
They research new ways of doing things; the magic word here is explore. They don't build systems that can scale to hundreds of thousands of data points; for that, they need an AI Engineer.
They read research papers, which an AI Engineer also does, and apply them in context.
They stay current on AI trends, which an AI Engineer also does, and apply them in context.
They hold a PhD, which an AI Engineer may or may not hold, nor necessarily need in order to do their work.
They spend time building novel algorithms, which is questionable, as they tend to have poor programming skills and lack the ability to scale any of those algorithms; on top of that, solving a problem requires practical experience, which one does not get from reading research papers. Again, an AI Engineer can pretty much do this, and likely better, with the mindset to apply design patterns and do it the correct way from prototype to scalable solution.
They get sponsorship funding for research work, which comes from holding a PhD rather than from any actual skill at converting theory into practice; for that, again, they need an AI Engineer. They also lack the ability to manage data, and they find it difficult to adapt what they have learned when the research approach changes. They tend to stick to what they know rather than learn what they don't know as part of adapting to change. They rarely like to approach anything outside the boundaries of their so-called specialist area. Funnily enough, at academic institutions it is in most cases a research assistant, not the professor, who builds the practical application of the theory.
They are specialized in a certain field, have likely taught in academia, given conference talks, and published papers in the area. Yet one cannot be a specialist in an area without the practical skill to apply any of it. It is questionable whether much of what they publish is even worthy of research, which is precisely why 80% of research coming out of industry and academic institutions amounts to nothing. The 20% that does get classed as a research breakthrough tends to be built by someone sponsored by an organization to solve a specific practical issue, who doesn't hold a PhD and likely has a practical engineering background. Furthermore, it is surprising how many AI Researchers make poor teachers. Invariably, many also dislike teaching but have to do it as part of being attached to an academic institution.
They collaborate on standardization efforts and open source artefacts as part of converting theory to practice. Even here, the majority of open standardization work is done by people with practical experience who understand the gaps in an application area. An AI Researcher rarely produces anything of significant value, yet usually tries to take credit for the bulk of the effort, which is likely done by an AI Engineer.
One can see that practical application is far more important than theory. It is from practical experience that one understands problems and works towards a solution. There is also no single way of solving a problem. A basic theoretical background can be gained without going to university, let alone attaining a PhD; there are books and online courses for literally anything and everything one can think of.
They have greater opportunities for academic influence and research. This may be true, because they have built up a network within the area for collaboration. However, most academic institutions get their funding from government trusts, grants, or the private sector. Unless the AI Engineer has a sponsorship or a network of associates that can fund their work, they may be at a slight disadvantage. AI Researchers, as a result of networking, also tend to have greater reach and influence, but even this can be countered by an AI Engineer who develops influence through practical applications. One way of beating an AI Researcher at their own game is to build open source projects, publish papers alongside them, and build a portfolio of practical solutions delivered to organizations. One doesn't need an advanced degree for practical achievement. Invariably, it is far more important to have the tenacity, curiosity, and enthusiasm to learn, explore, and work towards building a practical, novel solution.
AI Researchers tend to have a more focused academic background with limited practical experience, defined on their resume by publications and conference talks, whereas an AI Engineer tends to have a stronger practical bias and may or may not have published papers. At times, however, the job functions may be interchangeable, given the confusion and disarray in communication at most clueless organizations. A PhD is not an automatic pass to being an expert in the area. One may at times come across people calling themselves AI Researchers, PhD in hand, who are in fact completely clueless about the field or its application. One has to be mindful when hiring such people. There are many ways of spotting the fake AI Researcher, most of which relate to their lack of objectivity, questionable attitude, questionable understanding of a topic, confused sense of ethics, and lack of a critical evaluation process:
- they confuse Neuro-Linguistic Programming (NLP) with Natural Language Processing (NLP)
- they lack professionalism and respect when interacting with non-PhD people
- they have a history of academic dishonesty, which in workplace interactions converts into unethical practices and a false sense of entitlement
- they are often hypocritical about their professed ethical principles and code of conduct
- they don't treat others with respect and have an ingrained tendency to be overly defensive and biased in their communication
- they have published papers that show very little critical evaluation
- they have published papers that are likely plagiarised
- they have published papers that are not theoretically correct
- someone else may have written a published paper for which they took credit, e.g. via crowdsourcing, or with most of the work done by a supervisor to meet passing research indicators
- they are generally clueless, contradict themselves with their own actions and explanations, and dig themselves into an ever bigger hole of illogical thinking
- they haven't really published much after their thesis work
- they have a mediocre citation score for the majority of their papers
- they lie about their background
- they violate basic privacy laws during meetings, are rude in their interactions, or appear insecure by trying to invalidate others
- they are not self-critical of their own work and their own deficiencies; they spend more time criticizing others than on self-reflection
- they think they know what they are talking about just because they hold a PhD
- they have peculiar mannerisms, and the way they come across on a topic makes you question their claimed qualifications
- they display an unwelcoming or condescending attitude
- they like to use a lot of flowery language and to impress upon you how busy they are, even if they aren't that busy at all
- they will try to impress with their background, but will get caught using incorrect terms and incorrect logical thinking, automatically invalidating their position
- they make a lot of assumptions in their speech without backing up their claims
- they are unable to translate anything into any form of practical output, and what little they produce is packaged up as an API wrapper around someone else's work
- they are enamored of the academic institution where they earned their PhD and keep using it as their defense, yet have little to no contextual understanding of concepts at any level of technical depth when applied in practice
- it is very easy to put them on the spot and leave them speechless at being caught out
- they have a tendency to display cognitive biases in their actions and speech
- if workplace CCTV of the way they interact were played back to them, it would not only be embarrassing but would also display their rudeness and unprofessionalism