How to use Llama 2 in Magento

Have you ever thought about talking to an intelligent computer? Or maybe you are interested in building innovative AI projects on Magento with the Llama 2 LLM? Then check out Llama 2! It is the latest large language model developed by Meta AI, a division of Meta (formerly known as Facebook).

What is Llama 2?

Llama 2 belongs to a family of AI systems called Large Language Models (LLMs). These models can understand and generate human language. They are trained on huge amounts of text from sources such as books, websites and social media, which helps them learn how words and sentences are used across different contexts and topics.

How can we use it, and what is the installation process?

There are multiple ways to use Llama 2. Here we discuss running it on a local system or on a cloud server, because both are straightforward with Ollama.

About Ollama

Ollama is a handy tool that lets you run open-source large language models (LLMs) directly on your computer. It works with different models such as Llama 2, Code Llama, etc. It bundles everything a model needs – its settings, weights and code – into a single package called a Modelfile. Ollama simplifies running and adapting models locally, reducing the need for extensive manual setup and fine-tuning of AI systems. We can use it to build various applications such as natural language processing and image recognition.

Installation:

For Windows:

https://ollama.com/download/windows

For Mac:

https://ollama.com/download/mac

For Linux:

Install it using a single command:

curl -fsSL https://ollama.com/install.sh | sh

The list of models Ollama supports is available at https://ollama.com/library

A few models are listed below.

llama2

Note that the 7B needs at least 8GB of RAM to run, while the 13B needs 16GB and the 33B needs 32GB.

After installing Ollama, you can easily fetch any of the models with a simple pull command.

ollama pull llama2

After the pull command is complete, you can run it directly in the terminal to generate text.

For example:

ollama run llama2

Ollama also provides a REST API for running and managing models.

For endpoints, see the API documentation.
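For instance, the generate endpoint can be called with curl. The sketch below assumes Ollama is running locally on its default port, 11434; the prompt is only an illustration:

```shell
# Request body for Ollama's /api/generate endpoint. With "stream": false
# the server replies with a single JSON object whose "response" field
# holds the generated text.
body='{"model": "llama2", "prompt": "Why is the sky blue?", "stream": false}'
echo "$body"

# Send the request (requires a running Ollama server):
# curl -s http://localhost:11434/api/generate -d "$body"
```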

How can we use it in Magento?

As a shopping assistant:

With Llama 2 you can develop chatbots that can talk naturally with people. You have the flexibility to customize these chatbots to fit your specific niche, personality, and tone.
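As a rough sketch, such a chatbot backend could forward the conversation to Ollama's /api/chat endpoint; the system message and localhost address below are illustrative assumptions:

```shell
# Conversation payload for Ollama's /api/chat endpoint. The "system"
# message sets the assistant's persona; with "stream": false the server
# returns one JSON object with the reply under message.content.
body='{
  "model": "llama2",
  "stream": false,
  "messages": [
    {"role": "system", "content": "You are a helpful shopping assistant."},
    {"role": "user", "content": "Suggest a gift for a coffee lover."}
  ]
}'
echo "$body"

# Send it to a local Ollama server:
# curl -s http://localhost:11434/api/chat -d "$body"
```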

To automatically generate product content:
Using Llama 2, you can generate product content based on the product name, SKU or other attributes, according to your needs.

For example, using the following method you can generate product content based on the product name and set it as the product description as needed.

<?php
namespace Webkul\Ollama\Model;

class AIContent
{
    /**
     * @var \Magento\Framework\HTTP\Client\Curl
     */
    protected $curl;

    /**
     * @param \Magento\Framework\HTTP\Client\Curl $curl
     */
    public function __construct(
        \Magento\Framework\HTTP\Client\Curl $curl
    ) {
        $this->curl = $curl;
    }

    /**
     * Get AI-generated content for a product
     *
     * @param string $productName
     * @return string
     */
    public function getAIContent($productName)
    {
        $this->curl->addHeader('Content-Type', 'application/json');
        $this->curl->setOptions([
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_ENCODING => '',
            CURLOPT_MAXREDIRS => 10,
            CURLOPT_TIMEOUT => 0,
            CURLOPT_FOLLOWLOCATION => true,
            CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
        ]);

        // Build the JSON body expected by Ollama's /api/generate endpoint.
        $payload = json_encode([
            'model' => 'llama2',
            'prompt' => 'Product specification and description for ' . $productName,
            'stream' => false,
        ]);

        // Here we pass the Ollama API endpoint (replace with your server's address).
        $this->curl->post('http://XXX.XXX.XX.XXX:11434/api/generate', $payload);
        $productContent = json_decode($this->curl->getBody(), true);

        return $productContent['response'];
    }
}

Thank you !!
