Osmosis AI has open-sourced Osmosis-Apply-1.7B, a fine-tuned variant of Qwen3-1.7B designed to perform highly precise, structured code-merge tasks. Similar to IDE features such as Cursor's "Instant Apply," Osmosis-Apply-1.7B is optimized for context-aware, function-level code edits. The model delivers strong performance with far fewer parameters than much larger foundation models by leveraging code-specific formatting tags, a high-quality dataset, and integration with the Model Context Protocol (MCP).
Built for code-merge tasks
Unlike general-purpose LLMs, which struggle with diff application and semantic merging, Osmosis-Apply-1.7B is trained specifically to apply structured edits at the function or block level. The model takes three structured inputs: (1) the original code, (2) the edit or diff to apply, and (3) the expected merge format. It then returns a revised code block with the edit applied, wrapped in XML-style tags nested inside a code block. This format aligns with production-grade expectations and simplifies validation.
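As a rough illustration of that input/output contract, the sketch below assembles the three inputs into a single prompt and extracts the tag-wrapped result. The prompt wording, the `<code>` tag name, the Hugging Face repo id, and the use of the chat template are assumptions for illustration, not the model's documented format.

```python
# Minimal sketch of the three-part input and tag-wrapped output.
# The prompt layout, the <code> tag name, and the repo id are assumptions.
import re
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "osmosis-ai/Osmosis-Apply-1.7B"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

original_code = "def add(a, b):\n    return a + b\n"
edit = "def add(a, b):\n    return a + b + 1\n"

# (1) original code, (2) the edit/diff, (3) the expected merge format
prompt = (
    "Apply the edit to the original code and return the merged result "
    "wrapped in <code></code> tags.\n"
    f"<original>\n{original_code}</original>\n"
    f"<edit>\n{edit}</edit>\n"
)

messages = [{"role": "user", "content": prompt}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(inputs, max_new_tokens=512)
text = tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)

# Pull the merged code out of the tag-wrapped response for validation.
match = re.search(r"<code>(.*?)</code>", text, re.DOTALL)
merged = match.group(1) if match else text
print(merged)
```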
Training and reward structure
Osmosis-Apply-1.7B was fine-tuned on approximately 100,000 real-world commits from the commitpackft dataset, representing less than 15% of the full corpus. Each training sample was structured to reflect practical developer workflows. A reward-based post-training scheme was used:
- Exact match (including formatting): reward = 1.0
- Semantic match (ignoring blank lines): reward = 0.2
- Incorrect or malformed match: reward = 0.0
This reward scheme reinforces high-fidelity outputs while allowing some leniency for stylistic variation, closely mirroring how code review treats such changes in practice.
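A minimal sketch of how such a tiered reward could be computed, assuming that "semantic match" simply means equality after blank lines are removed; Osmosis AI's actual reward implementation may differ.

```python
def compute_reward(predicted: str, reference: str) -> float:
    """Tiered reward used during post-training (illustrative reconstruction)."""
    # Exact match, formatting included.
    if predicted == reference:
        return 1.0

    # "Semantic" match: ignore blank lines, keep everything else verbatim.
    strip_blank = lambda s: "\n".join(
        line for line in s.splitlines() if line.strip()
    )
    if strip_blank(predicted) == strip_blank(reference):
        return 0.2

    # Anything else (wrong content or malformed output) earns no reward.
    return 0.0


assert compute_reward("a = 1\n", "a = 1\n") == 1.0
assert compute_reward("a = 1\n\n", "a = 1\n") == 0.2
assert compute_reward("a = 2\n", "a = 1\n") == 0.0
```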
Benchmark results
Osmosis AI benchmarked the model on an evaluation set of 10,000 samples from the commitpackft dataset. The average reward scores show strong performance relative to much larger LLMs:
| Model | Reward |
|---|---|
| Osmosis-Apply-1.7B | 0.9805 |
| Claude Sonnet 4 | 0.9328 |
| GPT-3.5-Turbo | 0.8639 |
| Gemini-2.5-Flash | 0.7745 |

These results highlight the model's strength at applying localized changes while preserving semantics, formatting, and structure.
MCP integration for developer workflows
A key feature of the model is its native support for the Model Context Protocol (MCP), enabling structured context invocation with file hierarchies, function names, and edit tags. The model adheres to an apply-code MCP specification, allowing seamless use in CLI tools and IDE agents. It returns function-level changes and marks edits with well-structured, XML-style tags, which simplifies diffing and downstream validation.
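To make the integration concrete, here is a minimal sketch of an MCP server exposing an apply_code tool that forwards requests to a locally served model. The tool name, the prompt wording, and the assumption that the model is reachable through an OpenAI-compatible endpoint (for example one served by vLLM) are illustrative choices, not Osmosis AI's published reference implementation.

```python
# Hypothetical MCP server wrapping the model behind an "apply_code" tool.
# Assumes the official `mcp` Python SDK and an OpenAI-compatible endpoint
# (e.g., vLLM) serving the model at http://localhost:8000/v1.
from mcp.server.fastmcp import FastMCP
from openai import OpenAI

mcp = FastMCP("osmosis-apply")
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")


@mcp.tool()
def apply_code(original_code: str, edit: str) -> str:
    """Merge an edit into the original code and return the tagged result."""
    prompt = (
        "Apply the edit to the original code and return the merged code.\n"
        f"<original>\n{original_code}\n</original>\n"
        f"<edit>\n{edit}\n</edit>\n"
    )
    response = client.chat.completions.create(
        model="osmosis-ai/Osmosis-Apply-1.7B",  # assumed model id
        messages=[{"role": "user", "content": prompt}],
        temperature=0.0,
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    mcp.run(transport="stdio")  # expose the tool to CLI tools and IDE agents
```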
Developer tools and use cases
Osmosis AI has also published a reference implementation that supports both local inference and integration with serving frameworks such as vLLM. The tooling includes agent-based usage examples, an MCP server implementation, and safe-deployment guides; a minimal client sketch follows the list of use cases below.
Key use cases include:
- IDE agents offering an "instant apply" experience for user-specified changes
- CI bots applying edits from automated refactors or code review
- Data-generation pipelines for downstream fine-tuning
- Code-transformation tools with structure-aware merge logic
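For the IDE-agent and CI-bot scenarios above, a client driving the hypothetical apply_code MCP tool (from the server sketch earlier) over stdio might look like the following; the script name apply_server.py and the example inputs are placeholders.

```python
# Hypothetical CI-style client calling the apply_code tool over MCP (stdio).
# Assumes the server sketch above is saved as apply_server.py.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(command="python", args=["apply_server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "apply_code",
                arguments={
                    "original_code": "def greet():\n    print('hi')\n",
                    "edit": "def greet():\n    print('hello')\n",
                },
            )
            # Print the merged code returned by the model.
            print(result.content[0].text)


asyncio.run(main())
```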
Format and deployment
The model emits outputs wrapped in XML-style tags to ensure compatibility with automated validators. Inference-ready versions of the model are provided in several formats, notably safetensors and GGUF, for efficient deployment. Osmosis-Apply-1.7B can be hosted locally or served in quantized mode for optimized inference on constrained hardware.
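As one way to exercise a quantized GGUF build on constrained hardware, the sketch below loads a local GGUF file with llama-cpp-python; the file name, quantization level, and prompt format are placeholders rather than details from the release.

```python
# Minimal sketch of quantized local inference from a GGUF build.
# The GGUF file name and the prompt format are illustrative placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./Osmosis-Apply-1.7B-Q4_K_M.gguf",  # assumed local quantized file
    n_ctx=8192,        # enough context for the original code plus the edit
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

prompt = (
    "Apply the edit to the original code and return the merged code.\n"
    "<original>\ndef add(a, b):\n    return a + b\n</original>\n"
    "<edit>\ndef add(a, b):\n    return a + b + 1\n</edit>\n"
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": prompt}],
    temperature=0.0,
    max_tokens=512,
)
print(out["choices"][0]["message"]["content"])
```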
Availability and license
Osmosis-Apply-1.7B is available under the Apache-2.0 license and hosted on both Hugging Face and GitHub. The release includes all scripts needed for inference, MCP-compliant deployment examples, and structured formatting guides.
Conclusion
By open-sourcing Osmosis-Apply-1.7B, Osmosis AI addresses a key need for function-level, structure-aware code editing models. Unlike general foundation models, this specialized model combines a compact size with precision and format alignment. Its MCP integration, reward-based fine-tuning, and structured output support make it a strong candidate for real-world developer tooling.
Check out the GitHub page, the Hugging Face page, and the technical details. All credit for this research goes to the researchers of this project. Also, feel free to follow us on Twitter, YouTube, and Spotify, and don't forget to join our 100k+ ML SubReddit and subscribe to our Newsletter.
Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of artificial intelligence for social good. His most recent endeavor is the launch of an artificial intelligence media platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable to a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among audiences.
