This issue has been created
There are 2 updates.
 
 
LLM AI Integration / LLMAI-76 Open

Measure energy consumption for the benchmark

 
 

Issue created

 
Michael Hamann created this issue on 27/May/24 15:05
 
Summary: Measure energy consumption for the benchmark
Issue Type: New Feature
Affects Versions: 0.3.1
Assignee: Unassigned
Created: 27/May/24 15:05
Priority: Major
Reporter: Michael Hamann
Description:

In the LLM benchmark, we need to measure the energy consumption of the different tasks. For this, we should measure energy consumption on the inference server and attribute it to the tasks we execute, based on their running time. Measuring this exactly seems hard, so we should probably work with average values and try to come up with estimates of the energy consumed per input and output token.
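The attribution step could look roughly like the following sketch: split the energy measured on the inference server across tasks proportionally to their running time, then divide by token counts to get averages. All names, fields, and numbers here are illustrative assumptions, not the actual benchmark code or data.

```python
def apportion_energy(total_energy_j, tasks):
    """Split total measured server energy (joules) across tasks by their
    share of the total running time, then derive joules per token.
    Task dicts and field names are hypothetical."""
    total_runtime = sum(t["runtime_s"] for t in tasks)
    for t in tasks:
        t["energy_j"] = total_energy_j * t["runtime_s"] / total_runtime
        t["j_per_token"] = t["energy_j"] / (t["input_tokens"] + t["output_tokens"])
    return tasks

# Illustrative run: 54 kJ measured over two tasks.
tasks = [
    {"name": "summarize", "runtime_s": 120.0, "input_tokens": 8000, "output_tokens": 1000},
    {"name": "translate", "runtime_s": 60.0, "input_tokens": 3000, "output_tokens": 3000},
]
for t in apportion_energy(54_000.0, tasks):
    print(f'{t["name"]}: {t["energy_j"]:.0f} J, {t["j_per_token"]:.2f} J/token')
```

This pure time-proportional split ignores that tasks may load the GPU differently; distinguishing input from output tokens (as the description suggests) would need a weighted model on top of this.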

We should also compare our measurements to publicly reported performance, in particular for parallel requests. When a publication reports a certain number of tokens per second on a certain GPU, we can derive an upper bound on the energy consumed per token from that GPU's maximum power consumption and the reported throughput.
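The bound described above is a one-line calculation: at most the GPU's maximum power (watts, i.e. joules per second) is drawn, so each second of work costs at most that many joules, spread over the reported tokens per second. The figures below are illustrative, not measurements.

```python
def max_joules_per_token(gpu_max_power_w, tokens_per_second):
    """Upper bound on energy per token: the GPU cannot draw more than
    gpu_max_power_w joules each second, and in that second it produces
    tokens_per_second tokens."""
    return gpu_max_power_w / tokens_per_second

# e.g. a GPU with a 700 W maximum reported to sustain 2000 tokens/s:
bound = max_joules_per_token(700.0, 2000.0)
print(f"at most {bound:.3f} J/token")
```

Our own measurements should come in below such bounds; if they do not, that points to a problem in either the measurement or the attribution.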

 
 

2 updates

 
Changes by Michael Hamann on 27/May/24 15:06
 
Fix Version: 0.4
Assignee: Paul Pantiru