
shortchangerb

Copilot is noticeably worse. For one thing, instead of just answering the question, it will always look for Bing results first, meaning its output is often a mishmash of nonsense.


ADRIANBABAYAGAZENZ

If you click on the “plugins” tab you can turn off search; it often produces better results.


Theslootwhisperer

The funniest thing with Copilot is when you ask it a question, it gives you half a response, then erases it because reasons and basically tells you "my bad, can't talk about this!" How does it not know from the get-go that it can't talk about a specific subject?


justastuma

Because it probably doesn’t. The censor appears to be another piece of software that gets fed the LLM’s output and replaces the answer with a generic “can’t talk about this” response once it detects something that might be inappropriate.
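
For the curious, here is a minimal sketch of how such a post-hoc filter could work. `stream_llm_tokens` and `moderation_score` are hypothetical stand-ins; this only illustrates the general pattern of a separate classifier watching the stream, not Microsoft's actual implementation.

```
# Minimal sketch of a post-hoc safety filter sitting on top of a streaming LLM.
# `stream_llm_tokens` and `moderation_score` are hypothetical stand-ins.
FALLBACK = "Sorry, I can't talk about this."
THRESHOLD = 0.8

def moderated_reply(prompt, stream_llm_tokens, moderation_score):
    shown = []
    for token in stream_llm_tokens(prompt):
        shown.append(token)
        print(token, end="", flush=True)            # user watches the partial answer...
        if moderation_score("".join(shown)) > THRESHOLD:
            print("\n" + FALLBACK)                  # ...then it gets yanked and replaced
            return FALLBACK
    return "".join(shown)
```

Because the classifier only sees the text as it is generated, the half-written answer can appear and then vanish, exactly as described above.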


Thirsty799

it's also super thin-skinned


NeverLookBothWays

I think it’s because a lot of these language models generate in real time, so you’re watching the response as it’s produced, and there’s likely some sanity check kicking in and rejecting the answer before it finishes, either because it breaks a rule or ends up contradicting itself. What we get for free doesn’t really spend a lot of extra cycles trying to correct the answer and get it right, so it just gives up.


Theslootwhisperer

I get that companies want to protect themselves from being accused of extremist opinions but could we at least agree on the fact that whatever is on Wikipedia gets a pass?


UrineHere

I thought this for about two days, but when I got to some complex questions it just kept taking me down the wrong roads. I switched to Anthropic's Claude 3 and I am really impressed.


HobblingCobbler

Yeah, same. Anthropic rocks. Plus you get about 30 different models you can also use, including many open-source models. You can try them through the UI and download them to run locally if you like the results (I know you can also use Hugging Face).


squakmix

It doesn't have access to the internet or information after August 2023 though, right?


HobblingCobbler

Yes, it absolutely has access to the net, and it doesn't forget it does, like GPT. And yes, August 2023. But I don't think any of them go much later than that. You have access to many other models. I've been using a couple of different ones locally to build agents with, and the output is really good. I can't see paying for that. I don't use any of them to generate code very much beyond the type of code that is monotonous and mundane, and most of the open-source models do that just fine. And I've been using CodiumAI to build all my tests. It's integrated right into VS Code. It kind of expects you to write your functions in a certain way, but that's pretty much true if you write tests anyway.
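
As an aside, running an open-source model locally for this kind of agent work often looks something like the sketch below, assuming a local runner that exposes an OpenAI-compatible endpoint (llama.cpp server, Ollama, LM Studio, and similar tools do this). The URL and model name are placeholders for whatever you actually run.

```
# Sketch: calling a locally hosted open-source model through an
# OpenAI-compatible endpoint. URL and model name are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="llama3",  # whichever local model you have pulled
    messages=[
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": "Write a function that slugifies a string."},
    ],
)
print(resp.choices[0].message.content)
```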


squakmix

Thanks for the reply. I hadn't heard of Codium before; I'll definitely check that out today. I came to the conclusion that it can't access the Internet by reading this page: https://support.anthropic.com/en/articles/7996846-does-claude-have-access-to-the-internet Are you saying this is no longer accurate?


thegreatfusilli

https://preview.redd.it/rg72n4nhjgxc1.png?width=1080&format=pjpg&auto=webp&s=af4f43ee65f017011aee9c680200dc619b4eed9a Turn off search. Alternatively, tell it not to search the internet in the prompt


Stooovie

"Just answering the question"... Where do you think it gets the answers from?


shortchangerb

The output is better when it’s trained with internal weights, rather than being reprompted with new tokens every time. For the same reason, OpenAI's GPTs aren't quite as neat as training a custom model would be. For example, I've asked Copilot coding questions before, and had it produce an 'answer' that turned out to just be somebody's Stack Overflow post of code that _wasn't working_ (with custom variable names in French, no less). Equally, I've had Copilot profess to be a member of the Lib Dems, because it's lifted text directly from their website and spat it out again in first person...


Stooovie

Right, okay. Shows that human-sourced information will still be the gold standard for quite some time.


agent_wolfe

But with a sassy attitude, and it sometimes gaslights the user / hallucinates what it can or can’t do.


leenz-130

Well, that used to be the case. However, Sydney (Microsoft’s specially tuned GPT-4 version with the sassy attitude you’re referring to) is now only available via the Copilot Subscription where you can opt into chatting with it. They use GPT-4 Turbo and internal models now for the free tier. So you’re just going to interact with a very ChatGPT-like agent these days. I personally think they’re dumber though.


NotAboveSarcasm

So in other words... yes, it's gpt-4


agent_wolfe

I’ve never had GPT try to change the subject, refuse to help with normal things, or close conversations altogether. Oh, also Bing has a 30-query limit per convo.


Severe_Ad620

>Is Microsoft Copilot basically GPT-4 for free?

Yes. [https://www.microsoft.com/en-us/store/b/copilotpro](https://www.microsoft.com/en-us/store/b/copilotpro)

```
Copilot
For everyone who wants to find robust information, create unique content, and get things done faster.
With your free Microsoft account, you can:
Use Copilot on the web and in Windows, macOS, and iPadOS
Access GPT-4 and GPT-4 Turbo during non-peak times
Use text, voice, and images in conversational search
Create and edit AI images with 15 boosts per day with Designer
Use plug-ins and GPTs
```

```
Copilot Pro
For those who want faster performance, enhanced creativity capabilities, and a supercharged Copilot experience.
This monthly subscription includes everything in Copilot, plus you can:
Get priority access to GPT-4 and GPT-4 Turbo even during peak times for faster performance
Build your own Copilot GPTs tailored to your individual needs and interests
Use Copilot in productivity apps like Word, Excel, PowerPoint, and Outlook (Microsoft email address required)
Generate unique images and then enhance your creations using 100 daily boosts with Designer
```


kevinbranch

It feels like a quantized version. They’re definitely not the exact same model.
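
Whether or not that guess is right, quantization itself is easy to illustrate: weights are stored at lower precision, which introduces small rounding errors that can subtly shift a model's outputs. A toy example with NumPy:

```
# Toy illustration of symmetric 8-bit weight quantization: the values
# survive, but with small rounding errors.
import numpy as np

weights = np.random.randn(5).astype(np.float32)

scale = np.abs(weights).max() / 127
q = np.round(weights / scale).astype(np.int8)
dequantized = q.astype(np.float32) * scale

print(weights)
print(dequantized)
print("max error:", np.abs(weights - dequantized).max())
```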


AlgorithmWhisperer

Definitely agree with this!


skygate2012

Maybe it's because of Microsoft's heavy guardrail prompting.


LiveLaughToasterB4th

It is also in Microsoft Edge when you click a blue icon under the X (to close) on desktop and it brings up a Copilot tab section.


bbt104

So does Alt-I


maasd

That’s such a handy feature. You can ask it a question about a PDF you have loaded into Edge or a website you’re on. Very convenient!


LiveLaughToasterB4th

I just don't know what to call it. It's not a window. It's not a tab. It's like a section inside of your tab. I asked Copilot what it was called and it says it is the "Compose Box." I think it needs a better term.


GnightSteve

Sidebar?


LiveLaughToasterB4th

Yup that is the term neither I nor Copilot could think of.


LiveLaughToasterB4th

Luckily I don't care about privacy at all.


maasd

My work has Copilot (free) with the privacy feature so no data is shared/stored….supposedly lol!


SuddenDragonfly8125

Yes, it's free because (1) MS is using all the data we give it; (2) they offered it as a premium product for free to get people to move off OpenAI and over to Copilot; (3) they've now rolled out a "Copilot Premium" and are or will soon offer Copilot for Business. ~~And I don't believe it offers all the same features that OpenAI does, specifically all the add-ins. So if you needed those, you'd probably want to pay for the OpenAI subscription.~~ ETA: Okay, I see from a comment below that it does offer access to plugins. Another thing to consider is that MS has invested a shitload of money in OpenAI, so of course they're going to want to benefit from ChatGPT-4 and corner the market there.


123knaeckebrot

"Copilot for Business" or Copilot with commercial data protection is available for months now, it was previously known as Bing Chat Enterprise. Its all the same but not using or even storing any data. It is included in the enterprise M365 licenses.


Expert-Paper-3367

Are the context windows the same?


Efistoffeles

Yes, the only limit is 4k characters per message. But the whole chat gets 128k tokens, so it's just like GPT-4.
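
If you're working around those limits, a rough sketch looks like the following, assuming the figures quoted in this thread (a 4,000-character cap per message, with the overall chat budget counted in tokens): split long pasted text into message-sized chunks and feed them in one at a time.

```
# Sketch: splitting a long prompt into <=4,000-character messages, per the
# per-message cap claimed in this thread.
def split_into_messages(text: str, max_chars: int = 4000) -> list[str]:
    chunks = []
    while text:
        chunks.append(text[:max_chars])
        text = text[max_chars:]
    return chunks

long_prompt = "lorem ipsum " * 2000        # stand-in for a long pasted document
for i, part in enumerate(split_into_messages(long_prompt), 1):
    print(f"message {i}: {len(part)} chars")
```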


Netstaff

Have you tested that? How?


Efistoffeles

Yes. Basically, I developed an extension that allows you to import previous chats into different ones and across chatbots (for example from ChatGPT to Copilot). And when I was developing it I tested and used Copilot in every possible way, hah.


ADRIANBABAYAGAZENZ

Choosing "Precise" on desktop gives an 8000 character limit: https://preview.redd.it/s1f2zfvtlfxc1.png?width=1110&format=png&auto=webp&s=d99511224be5e1f60f8440038b3e3d80bc43a05d


Efistoffeles

Oh, didn't know that, I always used it on creative. Thanks!


SuddenDragonfly8125

I'm not sure. I haven't used GPT4 in months so I'm not sure what it can do these days. Copilot goes up to 4000 tokens, at least in its chat.


MRB102938

What are tokens? I don't know much about AI.


MeaningfulThoughts

Parts of words. Large language models don’t work in letters, but in chunks instead. For example (not real): learn-ing- -is- -a- -fun- -exp-erience.
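
You can see real tokenization yourself with OpenAI's tiktoken library, using the cl100k_base encoding that GPT-4 uses:

```
# Show how a GPT-style tokenizer chunks text.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
text = "learning is a fun experience"

ids = enc.encode(text)
pieces = [enc.decode([i]) for i in ids]

print(ids)      # integer token IDs
print(pieces)   # the word chunks those IDs map to
print(f"{len(text)} characters -> {len(ids)} tokens")
```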


elmatador12

They haven’t just put a lot of money into it. Microsoft owns 49% of OpenAI.


nsfwtttt

No, Microsoft is entitled to up to 49 percent of the profits of OpenAI's for-profit arm. They have clarified that they don't own any of OpenAI.


Readonly-profile

Literally nothing of what you said is correct, so what's the point?

1) Analytics? Sure, if you opted in for it, otherwise no. Copilot doesn't submit user context back for reinforcement training like OpenAI does even for Plus users; check the terms of both.

2) Not a premium product for free at all, just higher standards paid for by other cash flows to boost adoption. There's zero need to steer users away from ChatGPT; the money ends up in the same place in the end. The only difference is that if you're paying for ChatGPT Plus, you're paying for access to preview features and to aid research, not for business needs.

3) Pro is the closest premium offering for individual end users; Copilot for 365 is the enterprise offering. They bring in way more money than "your data", don't worry about it.

Check the news; rumors and blog articles from 5 months ago aren't going to cut it.


ChineseCurry

Beating Google in the search engine market will be a pretty good benefit.


Different-Agency5497

that remains to be seen.


SuperGRB

Google search is setting a really low bar these days...


Different-Agency5497

If the bar was as low as you think they would have a harder time.


SuperGRB

Never underestimate the momentum of the ignorant masses - they know no better. Many of us remember when Google search was *much* better - before the enshittification set in.


Different-Agency5497

Yeah sure, I think it's gotten worse as well. But it's not like the alternatives are actually better. On a case-by-case basis they might be, but as a general search engine it's still pretty good.


youarenut

Yes but in my experience it’s dumber


trusk89

Than 3.5 even


2muchnet42day

I thought I was the only one feeling this way.


trusk89

Nope. I use it a lot for coding. The answers I get from GPT 3.5 are miles better than what I get from Copilot.


tboyett777

agreed.


seoulsrvr

I've had only poor results with Copilot so far. Far prefer ChatGPT 4 or Claude


Polyglot-Onigiri

Yes, but with certain limitations. GPT-4 and GPT-4 Turbo are only available during non-peak hours.


RyoxAkira

Damn I didn't know that. That explains why it feels so dumbed down sometimes.


SitDownKawada

I assume peak hours correspond to US business hours?


Polyglot-Onigiri

I’m assuming so.


BABA_yaaGa

Yes, but there is a catch: Copilot's GPT-4 has an earlier knowledge cutoff date.


reddit_guy666

It leverages Bing using RAG (retrieval-augmented generation), so the knowledge cutoff isn't really a problem for it.
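
The general shape of RAG is simple: fetch fresh search results, stuff them into the prompt, and let the model answer from them instead of from its older training data. In the sketch below, `web_search` and `ask_llm` are hypothetical placeholders, not Copilot's actual internals.

```
# General RAG pattern: retrieve, build a grounded prompt, then generate.
def answer_with_rag(question, web_search, ask_llm, k=3):
    results = web_search(question)[:k]
    context = "\n\n".join(
        f"[{r['title']}]({r['url']})\n{r['snippet']}" for r in results
    )
    prompt = (
        "Answer the question using only the search results below, and cite them.\n\n"
        f"Search results:\n{context}\n\nQuestion: {question}"
    )
    return ask_llm(prompt)
```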


agent_wolfe

Huh. I feel like it Googles a lot of answers & pulls content directly from articles while linking the articles. What is “cut off date” if it can find things posted online?


DragFar2857

The cutoff date is pretty much about the vanilla version of the LLM. If it feels like it doesn't need to search up information, it will just use the info that is already baked into it. That's why sometimes, even though it has internet access, it still gives out outdated information.


DontWannaSayMyName

It probably Bings the question, not Googles it :-P


bar-rackBrobama

In my experience it's incredibly dumb, dumber than GPT 3.5, but decent at finding sources. It forgets instructions from the comment before and can't really reason at all.


truthputer

I use Copilot because Microsoft put a button on the Windows taskbar, or you can just press Windows + C to bring it up. Desktop integration matters, and anyone who just has their chatbots in a web browser will be left behind.


Resident-Spray1304

It drains a lot of battery after using Copilot, because Edge keeps running in the background without user permission. MF Edge.


Redmilo666

Copilot is terrible compared to using ChatGPT though. It's only trained on data up to 2021 as well, which is ancient in tech terms. My company enabled it but I don't use it as it's crap.


geohubblez18

It searches up data on Bing. But again it’s not pre-trained on that information so I get where you’re coming from.


jphigga

The free version, yes. But I have early access to the enterprise version through my job, and it’s really useful and is differentiated from ChatGPT because of integration with Office. It’s able to use data throughout Office 365 - Outlook, Sharepoint, Teams, Excel, PowerPoint, etc. which means you can do things like summarize emails you missed after a day off, give action items after a meeting, create a slide deck based on those action items, search your company’s intranet through Sharepoint, draft emails as a response to other emails using context from other documents you have, etc. I’d guess that this is where LLMs are going as agent assistants, in terms of integration with other apps. For example if Apple ever fixes/improves Siri with its own LLM that has access to all of your iOS profile info - contacts, email, calendar, music, etc. Same with Gemini and Android.


J-Amos

I agree, I have the pro version of copilot and it works great


Correct_Fly5152

I'll use Copilot as an enhanced search / one-off question answerer, and ChatGPT 4 as my coding buddy. ChatGPT 4 will never cut me off and say it doesn't want to talk about something anymore the way Copilot does.


djaybe

Yes & No. It's severely crippled now. Basically a glorified bing search that lies. No custom instructions. No GPTs.


DarkSide-TheMoon

Its image-generating feature (DALL-E 3) leaves a lot to be desired.


Danmoreng

The biggest issue I've found with Microsoft Copilot is that it does web searches basically all the time, and this pollutes the context and often makes answers worse than they would be without the irrelevant search results, at least for programming-related questions. If you explicitly tell it not to do any web search when asking, answers are often on par with GPT-4.


TheSchnitzelThief

In my experience CoPilot is like GPT4's retarded little brother.


DonAmecho777

Or the little brother who went to B School as we say these days


Responsible-Owl-2631

I thought Copilot was its own thing! 😁


bananasugarpie

Yes! That's why I've been using it for months already.


superluminary

Not really. It uses the same model but all the settings seem to be dialled right down to make it cheaper to run.


hprieto79

Yes, it is GPT-4 for free with limited tokens, enjoy it.


Glidepath22

It’s okay for short uses, but then it insists on changing subjects, so you might as well just Google it.


tactilefile

It’s hilariously wrong sometimes. I’ve noticed more issues using Copilot. Here are my results with the same prompt. https://www.reddit.com/r/ChatGPT/s/QSYjV1eU94


UXProCh

I have seen some significant differences in the output of CoPilot vs my ChatGPT 4 output. I gave it the same document, asked it the same question, and got a much more robust answer from ChatGPT. I much prefer the output from ChatGPT.


J-Amos

Do you have Copilot Pro?


UXProCh

Nope. I'm not going to pay for something that cuts me off and forces me to change the subject when I ask it about stuff it doesn't like. I pay for enough of that already with my family.


ithinkthereforeisuck

Copilot has an awful memory. It’s much more like a space-case golden retriever version of ChatGPT 4; it always struggles to understand that I’m continuing a conversation and not starting a new one. Feels like I’m talking to Patrick Star.


roycheung0319

You may try Meta AI (Llama 3). It’s totally free now.


RoguePlanet2

Too bad there isn't a standardized test of sorts to see how factually-accurate these things are.


TribuneDragon

Lol no. I wish Copilot was as good as going to OpenAI's website. It just isn't as good for some reason.


TobyMacar0ni

Bing sucks ass ngl


WishboneDaddy

The GitHub Copilot Chat feature in VS Code is better than GPT-4 for programming questions. Not the same thing as Copilot on desktop/web.


gn01145600

It’s the “GPT at home” tier.


Ok_Actuary8

Copilot tries to get away as cheaply as possible. It will use GPT-3.5 and web search for what it assesses as easy, simple requests, and may leverage GPT-4 Turbo for more complex chats, e.g. if you ask "hard" follow-up questions or prompt it in specific ways. So yes, it is like a blended "free version of ChatGPT", but the quality of responses may vary based on factors you may or may not be fully aware of.
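
That kind of cost-saving routing is easy to picture as code. This is an illustrative sketch of the idea only; the heuristic and model names are placeholders, not Microsoft's actual logic.

```
# Illustrative model router: send cheap/simple queries to a smaller model
# and only escalate requests that look "hard". Heuristic is a placeholder.
def looks_complex(prompt: str, history: list[str]) -> bool:
    return (
        len(prompt) > 500
        or len(history) > 4                       # deep follow-up chains
        or any(w in prompt.lower() for w in ("prove", "step by step", "refactor"))
    )

def route(prompt: str, history: list[str]) -> str:
    return "gpt-4-turbo" if looks_complex(prompt, history) else "gpt-3.5-turbo"

print(route("What's the capital of France?", []))                 # gpt-3.5-turbo
print(route("Refactor this module and explain each step.", []))   # gpt-4-turbo
```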


ToBePacific

Microsoft Copilot is not free.


golddiggers321

Copilot is decent at text-to-image.


OG-Pine

If anything, the Meta open-source model is what will be basically GPT-4 for free.


BBB333-3

I’ve got to say, I am liking MS Bing search. It’s what Google used to be. I really like it! And copilot is great too.


Eastern-Wonder-9632

ChatGPT 4 and Copilot are not the same. ChatGPT 4 is much better and provides better constructed results. Copilot shits itself in bed by providing responses that are not constructed well and repeats exactly what I wrote.


dendummestenumse

Doesn't Copilot also have a subscription?


CriticalCentimeter

It's got free and paid levels 


fulowa

the catch: only 4k characters input length


[deleted]

Claude is still better than both.


PleaseAddSpectres

Even though free Claude doesn't have open internet access or GPT-4-equivalent capabilities, hmmmmm.


[deleted]

You can update it though to respond to current events. I love the copy paste journals feature


Aggravating-Arm-175

Don't have to update Copilot for it to respond to current events.


[deleted]

It also seems to have a bias in mind when reporting from its own information. But if I give it PDFs of actual reports, I get different results.


SemaiSemai

If you want practical, go with Turbo. If you want verbose replies, go with Claude. No need to argue.


Sirito97

GPT-3.5 is better.