
microsoft/phi-1_5 · Hugging Face
The language model Phi-1.5 is a Transformer with 1.3 billion parameters. It was trained using the same data sources as phi-1, augmented with a new data source that consists of various NLP …
Textbooks Are All You Need II: phi-1.5 technical report
Dec 11, 2023 · In a nutshell, we build phi-1.5, a 1.3 billion parameter model trained on a dataset of 30 billion tokens, which achieves common sense reasoning benchmark results comparable to models …
Phi-1.5: Specifications and GPU VRAM Requirements
Microsoft's Phi-1.5 is a Transformer-based language model containing 1.3 billion parameters. It was developed to continue the investigation into the capabilities of smaller language models, …
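As a back-of-the-envelope check on VRAM requirements for a 1.3B-parameter model, here is a minimal sketch; the bytes-per-parameter table is standard, but the ~20% headroom factor for activations and KV cache is an illustrative assumption, not a figure from the page above:

```python
# Rough VRAM estimate: weights = parameter count x bytes per parameter.
# The 20% headroom for activations/KV cache is an assumed figure.
PARAMS = 1.3e9  # phi-1.5 parameter count

for dtype, nbytes in {"fp32": 4, "fp16/bf16": 2, "int8": 1, "int4": 0.5}.items():
    weights_gb = PARAMS * nbytes / 1024**3
    print(f"{dtype:>9}: weights ≈ {weights_gb:.1f} GB, "
          f"with ~20% headroom ≈ {weights_gb * 1.2:.1f} GB")
```

Under these assumptions, fp16 weights alone come to roughly 2.4 GB, which is why the model fits comfortably on consumer GPUs.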
Phi-1.5 Model: A Case of Comparing Apples to Oranges
Microsoft introduced Phi-1.5, a 1.3 billion parameter LLM, predominantly trained on 30 billion tokens of synthetic "textbook-like" data. The model's performance on common sense …
Phi-1.5: Synthetic Data LLM - emergentmind.com
Phi-1.5’s architecture is identical to phi-1. It consists of 24 Transformer layers, each employing 32 attention heads (each of dimension 64), yielding a total model size of 1.3B parameters.
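Those figures are easy to verify from the published model configuration. A minimal sketch, assuming the `transformers` library is installed and the Hugging Face Hub is reachable:

```python
from transformers import AutoConfig

# Pull the published phi-1.5 config and check the numbers quoted above:
# 24 layers, 32 attention heads, head dimension 64 (hidden size 2048).
config = AutoConfig.from_pretrained("microsoft/phi-1_5")

layers = config.num_hidden_layers        # expected: 24
heads = config.num_attention_heads       # expected: 32
head_dim = config.hidden_size // heads   # expected: 2048 // 32 = 64

print(f"layers={layers}, heads={heads}, head_dim={head_dim}")
```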
Phi 1.5 - Introduction and Analysis - DebuggerCafe
Jun 24, 2024 · Phi 1.5 follows the same path, albeit with small changes to the dataset curation and size. Phi 1.5 is a 1.3 billion parameter model (just like Phi 1). The architecture remains …
transformers/docs/source/en/model_doc/phi.md at main - GitHub
Phi is a 1.3B parameter transformer model optimized for Python code generation. It focuses on "textbook-quality" training data of code examples, exercises and synthetic Python problems …
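A minimal code-completion sketch against that checkpoint, assuming `transformers` and `torch` are installed; the prompt and generation settings are illustrative, not prescribed by the docs:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-1_5"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).to(device)

# A docstring-style prompt: phi checkpoints were trained on
# "textbook-quality" Python exercises shaped like this.
prompt = '''def fibonacci(n):
    """Return the n-th Fibonacci number."""
'''

inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```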
microsoft/phi-1_5 · Model Database
phi-1.5 can write poems, draft emails, create stories, summarize texts, write Python code (such as downloading a Hugging Face transformer model), etc. Given the nature of the training data, …
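A minimal sketch of that kind of general-purpose usage via the `transformers` text-generation pipeline; the prompt and sampling settings are illustrative assumptions:

```python
from transformers import pipeline

# The pipeline bundles tokenizer, model, and generation into one call.
generator = pipeline("text-generation", model="microsoft/phi-1_5")

prompt = "Write a short poem about small language models:"
out = generator(prompt, max_new_tokens=80, do_sample=True, temperature=0.7)
print(out[0]["generated_text"])
```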