bnew

Veteran
Joined
Nov 1, 2015
Messages
51,787
Reputation
7,926
Daps
148,627

Announcing our paper on Generative TV & Showrunner Agents! Create episodes of TV shows with a prompt - SHOW-1 will write, animate, direct, voice, and edit for you. We used South Park FOR RESEARCH ONLY - we won't be releasing the ability to make your own South Park episodes - it's not our IP!

 

bnew


Apple Tests ‘Apple GPT,’ Develops Generative AI Tools to Catch OpenAI


  • Company builds large language models and internal chatbot
  • Executives haven’t decided how to release tools to consumers
An Apple internal chatbot, dubbed by some as “Apple GPT,” bears similarities to OpenAI’s ChatGPT, pictured. Source: Bloomberg

By Mark Gurman
July 19, 2023 at 12:03 PM EDT
Updated on
July 19, 2023 at 12:15 PM EDT


Apple Inc. is quietly working on artificial intelligence tools that could challenge those of OpenAI Inc., Alphabet Inc.’s Google and others, but the company has yet to devise a clear strategy for releasing the technology to consumers.
The iPhone maker has built its own framework to create large language models — the AI-based systems at the heart of new offerings like ChatGPT and Google’s Bard — according to people with knowledge of the efforts. With that foundation, known as “Ajax,” Apple also has created a chatbot service that some engineers call “Apple GPT.”


Apple Races to Build Own Generative AI Framework

WATCH: Apple is making its own Generative AI tools.

In recent months, the AI push has become a major effort for Apple, with several teams collaborating on the project, said the people, who asked not to be identified because the matter is private. The work includes trying to address potential privacy concerns related to the technology.


Apple shares gained as much as 2.3% to a record high of $198.23 after Bloomberg reported on the AI effort Wednesday, rebounding from earlier losses. Microsoft Corp., OpenAI’s partner and main backer, slipped about 1% on the news.

A spokesman for Cupertino, California-based Apple declined to comment.

The company was caught flat-footed in the past year with the introduction of OpenAI’s ChatGPT, Google Bard and Microsoft’s Bing AI. Though Apple has woven AI features into products for years, it’s now playing catch-up in the buzzy market for generative tools, which can create essays, images and even video based on text prompts. The technology has captured the imagination of consumers and businesses in recent months, leading to a stampede of related products.

Apple has been conspicuously absent from the frenzy. Its main artificial intelligence product, the Siri voice assistant, has stagnated in recent years. But the company has made AI headway in other areas, including improvements to photos and search on the iPhone. There’s also a smarter version of auto-correct coming to its mobile devices this year.

Publicly, Chief Executive Officer Tim Cook has been circumspect about the flood of new AI services hitting the market. Though the technology has potential, there are still a “number of issues that need to be sorted,” he said during a conference call in May. Apple will be adding AI to more of its products, he said, but on a “very thoughtful basis.”

In an interview with Good Morning America, meanwhile, Cook said he uses ChatGPT and that it’s something that the company is “looking at closely.”

Behind the scenes, Apple has grown concerned about missing a potentially paramount shift in how devices operate. Generative AI promises to transform how people interact with phones, computers and other technology. And Apple’s devices, which produced revenue of nearly $320 billion in the last fiscal year, could suffer if the company doesn’t keep up with AI advances.



That’s why Apple began laying the foundation for AI services with the Ajax framework, as well as a ChatGPT-like tool for use internally. Ajax was first created last year to unify machine learning development at Apple, according to the people familiar with the effort.

The company has already deployed AI-related improvements to search, Siri and maps based on that system. And Ajax is now being used to create large language models and serve as the foundation for the internal ChatGPT-style tool, the people said.

How Does Artificial Intelligence Fit Into the Apple Empire?​


Apple is developing new AI tools, but isn’t sure how to commercialize them



Source: Apple’s 2022 annual report

The chatbot app was created as an experiment at the end of last year by a tiny engineering team. Its rollout within Apple was initially halted over security concerns about generative AI, but has since been extended to more employees. Still, the system requires special approval for access. There’s also a significant caveat: Any output from it can’t be used to develop features bound for customers.

Even so, Apple employees are using it to assist with product prototyping. It also summarizes text and answers questions based on data it has been trained with.

Apple isn’t the only one taking this approach. Samsung Electronics Co. and other technology companies have developed their own internal ChatGPT-like tools after concerns emerged about third-party services leaking sensitive data.

Read More: AI Doomsday Scenarios Are Gaining Traction in Silicon Valley

Apple employees say the company’s tool essentially replicates Bard, ChatGPT and Bing AI, and doesn’t include any novel features or technology. The system is accessible as a web application and has a stripped-down design not meant for public consumption. As such, Apple has no current plans to release it to consumers, though it is actively working to improve its underlying models.



Beyond the state of the technology, Apple is still trying to determine the consumer angle for generative AI. It’s now working on several related initiatives — a cross-company effort between its AI and software engineering groups, as well as the cloud services engineering team that would supply the infrastructure for any significant new features. While the company doesn’t yet have a concrete plan, people familiar with the work believe Apple is aiming to make a significant AI-related announcement next year.



John Giannandrea, the company’s head of machine learning and AI, and Craig Federighi, Apple’s top software engineering executive, are leading the efforts. But they haven’t presented a unified front within Apple, said the people. Giannandrea has signaled that he wants to take a more conservative approach, with a desire to see how recent developments from others evolve.



Apple Hosts Worldwide Developers Conference


Publicly, Apple’s Tim Cook has expressed caution about the flood of generative AI services hitting the market. Source: Bloomberg

Around the same time that it began developing its own tools, Apple conducted a corporate trial of OpenAI’s technology. It also weighed signing a larger contract with OpenAI, which licenses its services to Microsoft, Shutterstock Inc. and Salesforce Inc.

Apple’s Ajax system is built on top of Google’s JAX, the search giant’s machine learning framework. Apple’s system runs on Google Cloud, which the company uses to power cloud services alongside its own infrastructure and Amazon.com Inc.’s AWS.

As part of its recent work, Apple is seeking to hire more experts in generative AI. On its website, it is advertising for engineers with a “robust understanding of large language models and generative AI” and promises to work on applying that technology to the way “people communicate, create, connect and consume media” on iPhones and its other devices.

An ideal spot for Apple to integrate its LLM technology would be inside Siri, allowing the voice assistant to conduct more tasks on behalf of users. Despite launching in 2011, before rival systems, Siri lagged competitors as Apple focused on other areas and adopted fewer features in favor of privacy.

In his May remarks, Cook defended the company’s AI strategy, saying the technology is used across much of its product lineup, including in features like car-crash and fall detection. More recently, Cook said that LLMs have “great promise,” while warning about the possibility of bias and misinformation. He also called for guardrails and regulation in the space.



The company expanded its artificial intelligence efforts in 2018 with the hiring of Giannandrea, who previously led search and AI at Google. Since then, Apple hasn’t released many splashy new AI features, but at least two initiatives could help put it on the map.

The company is planning a new health coaching service codenamed Quartz that relies on data from an Apple Watch and uses AI to personalize plans, Bloomberg reported in April. And the company’s future electric car will use artificial intelligence to power the vehicle’s self-driving capabilities.
 

bnew





Researchers Chart Alarming Decline in ChatGPT Response Quality

By Mark Tyson
published about 5 hours ago

For example, GPT-4's prime number identification accuracy fell from 97.6% to 2.4% between March and June 2023.

ChatGPT quality declines

(Image credit: Future)

In recent months there has been a groundswell of anecdotal evidence and general murmurings concerning a decline in the quality of ChatGPT responses. A team of researchers from Stanford and UC Berkeley decided to determine whether there was indeed degradation and come up with metrics to quantify the scale of detrimental change. To cut a long story short, the dive in ChatGPT quality certainly wasn't imagined.

Three distinguished academics, Matei Zaharia, Lingjiao Chen, and James Zou, were behind the recently published research paper How Is ChatGPT's Behavior Changing Over Time? (PDF). Earlier today, Zaharia, a computer science professor at UC Berkeley, took to Twitter to share the findings. He startlingly highlighted that "GPT-4's success rate on 'is this number prime? think step by step' fell from 97.6% to 2.4% from March to June."

GPT-4 became generally available about two weeks ago and was championed by OpenAI as its most advanced and capable model. It was quickly released to paying API developers, with claims that it could power a range of innovative new AI products. It is therefore sad and surprising that the new study finds it so wanting in quality on some pretty straightforward queries.

We have already given an example of GPT-4's striking failure rate on the prime number queries above. The research team designed tasks to measure the following qualitative aspects of ChatGPT's underlying large language models (LLMs), GPT-4 and GPT-3.5. The tasks fall into four categories, measuring a diverse range of AI skills while being relatively simple to evaluate for performance.

  • Solving math problems
  • Answering sensitive questions
  • Code generation
  • Visual reasoning
An overview of the performance of the OpenAI LLMs is provided in the chart below. The researchers evaluated the March 2023 and June 2023 releases of both GPT-4 and GPT-3.5.
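The headline prime-number result is easy to reproduce in spirit with a ground-truth checker: ask the model "is N prime?", then score its answers against a deterministic test. The sketch below is illustrative only, not the paper's actual evaluation harness, and the "model answers" are invented for the example.

```python
# Sketch only: score hypothetical model answers to "is this number prime?"
# against a deterministic ground-truth primality test.

def is_prime(n: int) -> bool:
    """Trial-division primality test (exact, fine for small n)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return False
        f += 2
    return True

def accuracy(model_answers: dict) -> float:
    """Fraction of numbers the model labeled correctly."""
    correct = sum(ans == is_prime(n) for n, ans in model_answers.items())
    return correct / len(model_answers)

# Hypothetical model responses, invented for illustration:
answers = {97: True, 91: True, 89: True, 51: False}  # 91 = 7*13, 51 = 3*17
print(f"accuracy: {accuracy(answers):.1%}")  # 3 of 4 correct -> 75.0%
```

Scoring against an exact checker like this is what makes the task "simple to evaluate": there is no grading ambiguity, so any drop in accuracy reflects the model, not the metric.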




(Image credit: Matei Zaharia, Lingjiao Chen, James Zou)

It is clearly illustrated that the "same" LLM service answers queries quite differently over time, with significant differences seen over this relatively short period. It remains unclear how these LLMs are updated and whether changes intended to improve some aspects of their performance can negatively impact others. Note how much 'worse' the newest version of GPT-4 is than the March version in three of the testing categories; it wins by only a small margin in visual reasoning.




(Image credit: Matei Zaharia, Lingjiao Chen, James Zou)

Some may be unbothered by the variable quality observed in the 'same versions' of these LLMs. However, the researchers note, "Due to the popularity of ChatGPT, both GPT-4 and GPT-3.5 have been widely adopted by individual users and a number of businesses." Therefore, it isn't beyond the bounds of possibility that some GPT-generated information can affect your life.

The researchers have voiced their intent to continue assessing GPT versions in a longer study. Perhaps OpenAI should monitor and publish its own regular quality checks for its paying customers. If it can't be more transparent about this, it may be necessary for business or governmental organizations to keep a check on some basic quality metrics for these LLMs, which can have significant commercial and research impacts.



AI and LLM tech is no stranger to surprising issues, and with the industry's data-pilfering claims and other PR quagmires, it currently seems to be the latest 'wild west' frontier of connected life and commerce.
 

bnew


Microsoft puts a steep price on Copilot, its AI-powered future of Office documents

Microsoft 365 businesses will have to pay $30 per user per month extra to get access to Copilot.

By Tom Warren, a senior editor covering Microsoft, PC gaming, console, and tech. He founded WinRumors, a site dedicated to Microsoft news, before joining The Verge in 2012.

Jul 18, 2023, 11:30 AM EDT

Illustration of Microsoft’s new AI-powered Copilot for Office apps

Image: Microsoft

Microsoft is putting a price on the AI-powered future of Office documents, and it’s a steep one for businesses looking to adopt Microsoft’s latest technology. Microsoft 365 Copilot will be available for $30 per user per month for Microsoft 365 E3, E5, Business Standard, and Business Premium customers.

That’s a big premium over the cost of the existing Microsoft 365 plans right now. Microsoft charges businesses $36 per user per month for Microsoft 365 E3, which includes access to Office apps, Teams, SharePoint, OneDrive, and many other productivity features. A $30 premium for access to Microsoft 365 Copilot will nearly double the cost for businesses subscribed to E3 that want these AI-powered features. For Microsoft 365 Business Standard, that’s almost three times the cost, given that it’s $12.50 per user per month.
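The plan arithmetic above can be sanity-checked in a few lines. The base prices are the ones quoted in the article; the script is purely illustrative.

```python
# Sanity-check the Copilot pricing math quoted above (prices from the article).
COPILOT_ADDON = 30.00  # Copilot add-on, per user per month

plans = {
    "Microsoft 365 E3": 36.00,
    "Microsoft 365 Business Standard": 12.50,
}

for name, base in plans.items():
    total = base + COPILOT_ADDON
    print(f"{name}: ${base:.2f} + ${COPILOT_ADDON:.2f} = ${total:.2f} "
          f"({total / base:.2f}x the base price)")
```

For E3 the total comes to $66.00, roughly 1.8x the base plan; for Business Standard it is $42.50, about 3.4x the base plan, which is why the add-on stings much more for smaller-tier customers.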



Copilot can appear in Word to generate text or alter paragraphs. Image: Microsoft

Microsoft is trying to overhaul its Office apps with its AI-powered Copilot service, allowing businesses to instantly summarize documents, generate emails, and speed up Excel analysis. Microsoft 365 Copilot certainly looks like a very compelling feature addition, and I genuinely believe it will change Office documents forever, but the cost could put a lot of existing Microsoft 365 businesses off adopting Copilot in the short term.

Around 600 enterprise customers have been testing Microsoft 365 Copilot during a paid early access program over the past several months. Companies like KPMG, Lumen, and Emirates NBD have all had access. “We’re learning that the more customers use Copilot, the more their enthusiasm for Copilot grows,” says Yusuf Mehdi, Microsoft’s head of consumer marketing, in a blog post today. “Soon, no one will want to work without it.”

Microsoft hasn’t put a release date on Microsoft 365 Copilot just yet, though. The software giant will face competition from Google, too. Microsoft’s Copilot announcement came just days after Google announced similar AI features for Google Workspace earlier this year, including AI-assisted text generation in Gmail, Docs, and more. Zoom and Salesforce have also been adding AI-powered features, so all eyes will now be on how Google, Zoom, and Salesforce handle pricing for their AI additions going forward.



Copilot can handle generating long and short emails, too. Image: Microsoft

Part of the reason why Microsoft 365 Copilot is priced highly is because of the investment Microsoft has been making in building out its AI-powered offerings. Microsoft has invested billions into its OpenAI partnership to get this all off the ground. Tech companies like Microsoft have also been scrambling for Nvidia GPUs to power these features, so there’s a premium on what tasks this infrastructure is thrown at until chip availability and costs come down. Microsoft is reportedly working on its own AI chips in an attempt to avoid a costly reliance on Nvidia.

Microsoft is also bringing this Copilot experience to Teams, with integration into the Teams phone calling experience and inside Teams Chat threads. You can read more about these new Microsoft Teams Copilot features here.

Alongside the pricing announcement, Microsoft is also launching Bing Chat Enterprise. It’s essentially the same Bing Chat that’s available to consumers but with added commercial data protection. Microsoft is rolling out a preview of this today, and it’s included at no additional cost in Microsoft 365 E3, E5, Business Standard, and Business Premium. You can read more about Bing Chat Enterprise right here.
 

bnew



118,770 views Jul 11, 2023 #BBCNews #AI
Scientists are harnessing the power of artificial intelligence (AI) to decode animal languages. Scientists in Israel are closer than ever to making two-way communication with another species possible, by using AI to understand the language of bats. “I’ve always dreamt of a Dolittle machine that will allow me to talk with animals,” said Professor Yossi Yovel of Tel Aviv University. The team at Tel Aviv University created a large database of bat noises and videos, teaching AI how to differentiate between the different sounds. The algorithm then matched the sounds with specific social interactions captured on film. Adi Rachum from Tel Aviv University said, “In the end the computer will be able to speak the language, to understand what they say to each other.”
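The core idea of matching call sounds to interaction labels can be illustrated with a toy classifier. This is emphatically not the Tel Aviv team's method: the features, labels, and values below are all invented, and a simple nearest-centroid rule stands in for whatever models the researchers actually use.

```python
# Toy illustration: classify a bat call by which labeled group of calls
# it most resembles, using a nearest-centroid rule on made-up features.
from statistics import mean

def centroid(vectors):
    """Per-dimension mean of a list of feature vectors."""
    return [mean(dim) for dim in zip(*vectors)]

def classify(sample, centroids):
    """Return the label whose centroid is nearest (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Invented training data: [peak_freq_kHz, duration_ms] per labeled call.
calls = {
    "food_squabble": [[32.0, 12.0], [34.0, 11.0]],
    "mating_call":   [[18.0, 40.0], [20.0, 38.0]],
}
centroids = {label: centroid(vs) for label, vs in calls.items()}
print(classify([33.0, 10.0], centroids))  # near the food_squabble centroid
```

The real work lies in building the labeled database in the first place: pairing each recorded call with the social interaction seen on film is what turns raw audio into training data.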
 

Rembrandt

the artist
Joined
Jan 13, 2016
Messages
13,673
Reputation
1,315
Daps
37,262
Reppin
Villa Diodati

Yikes. Google is doing it for Workspace accounts for free, already integrated into Gmail, Docs, and Sheets.

This should be interesting, especially with Bard progressing pretty well.
 

bnew



Run Llama-2 Locally in 7 Lines!

For Apple Silicon Mac




TLDR


Code:
xcode-select --install # Make sure git & clang are installed
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
curl -L https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_K_M.bin --output ./models/llama-2-7b-chat.ggmlv3.q4_K_M.bin
LLAMA_METAL=1 make
./main -m ./models/llama-2-7b-chat.ggmlv3.q4_K_M.bin -n 1024 -ngl 1 -p "Give me 5 things to do in NYC"


NOTE: The 7B model weights are about 4GB in size, please ensure you have enough space on your machine.
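Before running the curl step, it's worth confirming the disk actually has room. A minimal check, where `REQUIRED_GB` is an assumption based on the ~4GB size noted above:

```python
# Quick pre-download sanity check that the disk has room for the weights.
# REQUIRED_GB is an assumption based on the ~4GB size noted above.
import shutil

REQUIRED_GB = 4
free_gb = shutil.disk_usage(".").free / 1e9
print(f"free space: {free_gb:.1f} GB; "
      f"enough for the 7B weights: {free_gb >= REQUIRED_GB}")
```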

What is this doing?

This uses the amazing llama.cpp project by Georgi Gerganov to run Llama 2. It downloads a 4-bit quantized set of weights for Llama 2 7B Chat, published by TheBloke on Hugging Face, puts it into the models directory in llama.cpp, then builds llama.cpp with Apple’s Metal optimizations. This allows you to run Llama 2 locally with minimal work. The 7B weights should work on machines with 8GB of RAM (better if you have 16GB). Larger models like 13B or 70B will require significantly more RAM. Note that the non-chat Llama 2 weights are also available (TheBloke/Llama-2-7B-GGML on Hugging Face); however, if you want a simple chat interface, the chat weights are preferred.

Performance

On an M2 Max MacBook Pro, I was able to get 35–40 tokens per second using the LLAMA_METAL build flag. Your performance may vary depending on your Apple Silicon chip.

Thanks and acknowledgements

It’s amazing that we can get state-of-the-art large language models running locally so soon after their release. This wouldn’t be possible without the Llama 2 weights being open-sourced by Meta, the llama.cpp project, and TheBloke’s optimized Llama 2 model weights on Hugging Face.

LastMile AI

This is an AI workbook by LastMile AI. It is not a static page but an interactive AI workbook. AI workbooks allow you to play with generative text, image, and audio AI models. Here’s an example of a ChatGPT cell.


CHATGPT
What does this code do? xcode-select --install # Make sure git & clang are installed git clone https://github.com/ggerganov/llama.cpp cd llama.cpp curl -L https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_K_M.bin --output ./models/llama-2-7b-chat.ggmlv3.q4_K_M.bin LLAMA_METAL=1 make ./main -m ./models/llama-2-7b-chat.ggmlv3.q4_K_M.bin -n 1024 -ngl 1 -p "Give me 5 things to do in NYC"

This code performs the following actions:
  1. Installs git and clang (if not already installed).
  2. Clones the llama.cpp repository from GitHub.
  3. Navigates to the cloned repository.
  4. Downloads a pre-trained language model file (llama-2-7b-chat.ggmlv3.q4_K_M.bin) from the Hugging Face model hub and saves it in the "models" directory within the cloned repository.
  5. Sets the environment variable "LLAMA_METAL" to 1.
  6. Executes the "make" command to compile the code.
  7. Runs the compiled program "main" with the specified command-line arguments: using the downloaded language model file (-m), generating up to 1024 tokens (-n 1024), offloading 1 layer to the GPU (-ngl 1), and providing the prompt "Give me 5 things to do in NYC" (-p).
Overall, this code sets up and runs a chatbot program that uses the LLAMA language model to generate text based on the given prompt.


Pretty cool, right? AI workbooks are free to use for personal use cases. Get started here!
 