
Conversation

0xpeluche (Contributor)

No description provided.

@0xpeluche 0xpeluche self-assigned this Sep 15, 2025
  const batch: any = await Tables.DIMENSIONS_DATA.findAll({
    where: { type: adapterType, updatedat: { [Op.gte]: timestamp * 1000 } },
-   attributes: ['data', 'timestamp', 'id', 'timeS'],
+   attributes: ['data', 'timestamp', 'id', 'timeS', 'bl', 'blc'],
Member

skip blc

0xpeluche (Contributor, author)

blc values are being read, but as long as DIM_STORE_BLC === 'false' is set, they are not stored in the common cache file. This keeps an easy switch if one day we want to enable them, but if you prefer I can completely remove the blc-related logic so that it no longer appears in the code at all.

  const batch: any = await Tables.DIMENSIONS_DATA.findAll({
    where: filterCondition,
-   attributes: ['data', 'timestamp', 'id', 'timeS'],
+   attributes: ['data', 'timestamp', 'id', 'timeS', 'bl', 'blc'],
Member

skip blc

}


const STORE_BLC = process.env.DIM_STORE_BLC !== 'false' // blc values are stored unless DIM_STORE_BLC is explicitly set to 'false'
Member

try pulling this from the env.ts file; that way we have a record of all the flags that we use
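
For reference, a minimal sketch of what that could look like, assuming env.ts is a plain module that centralizes every flag the service reads (the module layout here is an assumption; only the flag name comes from the PR):

// env.ts (sketch) -- one central record of every flag the service reads,
// so new flags such as DIM_STORE_BLC stay discoverable in a single place.
export const ENV = {
  DIM_STORE_BLC: process.env.DIM_STORE_BLC ?? 'true',
}

// elsewhere, instead of touching process.env directly:
// import { ENV } from '../env'
// const STORE_BLC = ENV.DIM_STORE_BLC !== 'false'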

Member

on second thought, can you remove all blc-related code for now? Also remove the summary breakdown code as well; let's focus only on financial statements, so only on fee data (see the sketch after this list):

  • we aggregate all data of a given metric by month & by quarter (df, dr, dssr, dhr, etc.), and also aggregate the labels per metric where present
  • add smart logic that fills in missing data where possible, e.g. if df & dr are present, show aggregated dssr as df - dr (and the other way around), same for dhr = dr - dssr
  • add the breakdown labels as part of the final output file
  • probably best to generate parent protocol data after the child data is generated; easier to read and merge the aggregated child data?
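
A rough sketch of the monthly/quarterly aggregation and gap-filling described above; the DailyRecord shape and helper names are illustrative, not the repo's actual types, and only the metric abbreviations (df, dr, dssr, dhr) come from the comment:

type DailyRecord = {
  timestamp: number                                 // unix seconds, UTC
  metrics: Record<string, number>                   // e.g. { df: 1200, dr: 300 }
  labels?: Record<string, Record<string, number>>   // per-metric label breakdown
}

type Bucket = { metrics: Record<string, number>; labels: Record<string, Record<string, number>> }

function periodKeys(ts: number) {
  const d = new Date(ts * 1000)
  const month = `${d.getUTCFullYear()}-${String(d.getUTCMonth() + 1).padStart(2, '0')}`
  const quarter = `${d.getUTCFullYear()}-Q${Math.floor(d.getUTCMonth() / 3) + 1}`
  return { month, quarter }
}

function aggregateByPeriod(records: DailyRecord[]) {
  const monthly: Record<string, Bucket> = {}
  const quarterly: Record<string, Bucket> = {}

  const addTo = (store: Record<string, Bucket>, key: string, rec: DailyRecord) => {
    const bucket = (store[key] ??= { metrics: {}, labels: {} })
    for (const [metric, value] of Object.entries(rec.metrics)) {
      bucket.metrics[metric] = (bucket.metrics[metric] ?? 0) + value
    }
    for (const [metric, byLabel] of Object.entries(rec.labels ?? {})) {
      const target = (bucket.labels[metric] ??= {})
      for (const [label, value] of Object.entries(byLabel)) {
        target[label] = (target[label] ?? 0) + value
      }
    }
  }

  for (const rec of records) {
    const { month, quarter } = periodKeys(rec.timestamp)
    addTo(monthly, month, rec)
    addTo(quarterly, quarter, rec)
  }

  // fill what can be derived: dssr = df - dr, dr = df - dssr, dhr = dr - dssr
  for (const bucket of [...Object.values(monthly), ...Object.values(quarterly)]) {
    const m = bucket.metrics
    if (m.dssr === undefined && m.df !== undefined && m.dr !== undefined) m.dssr = m.df - m.dr
    if (m.dr === undefined && m.df !== undefined && m.dssr !== undefined) m.dr = m.df - m.dssr
    if (m.dhr === undefined && m.dr !== undefined && m.dssr !== undefined) m.dhr = m.dr - m.dssr
  }

  return { monthly, quarterly }
}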

router.get("/overview/:type", ew(getOverviewFileRoute))
router.get("/overview/:type/:chain", ew(getOverviewFileRoute))
router.get("/summary/:type/:name", ew(getDimensionProtocolFileRoute))
router.get("/summaryBreakdown/:type/:name", ew(getDimensionBreakdownProtocolFileRoute))
Member

what I would do here is, instead of adding these under the normal router, create a new pro API router and add them there

@g1nt0ki (Member) left a comment

.

return filename
}

function findBuildDir(): string | null {
Member

no, this is the wrong approach; if you want to read a generated file, you should use readRouteFile

Member

*readRouteData


for (const { abbr, dir } of recordTypeDirs) {
let files: string[] = []
try { files = fs.readdirSync(dir) } catch { files = [] }
Member

no, you can get this from 'fees/df-lite' and read the protocol list from there? or, get the fee protocol adapter list and try to read each adapter's data from there?
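
A sketch of the first option, assuming readRouteData (mentioned above) resolves a generated route file by its route path; the presence of a protocols array on the 'fees/df-lite' payload is an assumption about that file's shape:

// sketch: derive the protocol list from the already-generated 'fees/df-lite' file
// instead of scanning build directories on disk.
async function getFeeProtocolNames(readRouteData: (route: string) => Promise<any>): Promise<string[]> {
  const overview = await readRouteData('fees/df-lite')
  if (!overview?.protocols) return []
  return overview.protocols.map((p: any) => p.name)  // `protocols[].name` is an assumed field
}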

}

export function setProRoutes(router: HyperExpress.Router, _routerBasePath: string) {
router.get("/summaryBreakdown/:type/:name", ew(getDimensionBreakdownProtocolFileRoute))
Member

can you change this to /v2/financial-statement/:name? for now, let's restrict it to aggregated monthly/quarterly data of the combined fee data

router.get("/aggregates/:type/:name", ew(getAggregatesProtocolFileRoute))
router.get("/aggregatesBreakdown/:type/:name", ew(getAggregatesBreakdownProtocolFileRoute))
}

Member

can we combine these two into /v2/metrics/:type/:name?breakdown=true|false?
Adding versioning makes it easy for us to deprecate the API in the future, and having breakdown as a request parameter cuts down on the number of exposed routes.
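
A sketch of how the combined route could be wired; ew is the repo's existing error wrapper, the handler stands for whichever merged handler ends up serving both variants, and reading the query string via req.query_parameters is an assumption about the HyperExpress request object:

import HyperExpress from 'hyper-express'

// sketch: one versioned route replacing /aggregates and /aggregatesBreakdown,
// with the breakdown toggled by a query parameter.
export function setV2MetricsRoute(
  router: HyperExpress.Router,
  ew: (fn: any) => any,
  handler: (req: HyperExpress.Request, res: HyperExpress.Response, opts: { includeBreakdown: boolean }) => Promise<any>,
) {
  router.get('/v2/metrics/:type/:name', ew(async (req: HyperExpress.Request, res: HyperExpress.Response) => {
    const includeBreakdown = req.query_parameters?.breakdown === 'true'  // assumption: query string exposed as query_parameters
    return handler(req, res, { includeBreakdown })
  }))
}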

const data = await readRouteData(routeFile)
if (!data) return errorResponse(res, errorMessage)

const response: any = { ...data }
Member

no need to clone since we are reading from file each time?


const filteredMetrics: any = {}
Object.keys(data.metrics || {}).forEach(key => {
const isBreakdownMetric = key.endsWith('bl')
Member

hmm, need to check how the file is generated; my idea with files was that they can be more or less streamed as the response without any processing


response.metrics = filteredMetrics

const cleanResponse = JSON.parse(JSON.stringify(response))
Member

not needed?

Comment on lines 268 to 269
export async function getDimensionBreakdownProtocolFileRoute(req: HyperExpress.Request, res: HyperExpress.Response) {
const protocolName = req.path_parameters.name?.toLowerCase()
Member

can you add validation that adapterType is fees here? And since it is a monthly/quarterly report, I want it to include all datatypes; this way, when the front-end shows the financial report of a given protocol, it fetches everything in a single call.
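
A sketch of the validation being asked for; errorResponse below stands in for the helper already used in this file, comparing adapterType against the plain string 'fees' is an assumption about how the type is represented, and the route shape may change once this becomes /v2/financial-statement/:name:

import HyperExpress from 'hyper-express'

const errorResponse = (res: HyperExpress.Response, message: string) =>
  res.status(400).json({ message })

export async function getFinancialStatementRoute(req: HyperExpress.Request, res: HyperExpress.Response) {
  const adapterType = req.path_parameters.type?.toLowerCase()
  const protocolName = req.path_parameters.name?.toLowerCase()
  if (adapterType !== 'fees')
    return errorResponse(res, 'financial statements are only available for fee data')
  // from here: load monthly/quarterly aggregates for every fee datatype of `protocolName`
  // (df, dr, dssr, dhr, ...) so the front-end gets the full report in one call
}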

if (!src) return
target.metadata = target.metadata || {}

const KEYS = [
Member

hmm, this exact list is used somewhere else; better to move that list to the top so it is not duplicated
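
A tiny sketch of the de-duplication being suggested: hoist the list to a module-level constant and import it in both call sites. The actual contents of KEYS are truncated in this diff, so the entries below are placeholders:

// constants.ts (sketch) -- single source of truth for the shared key list;
// the entries here are placeholders, the real ones are the KEYS array from this PR.
export const SHARED_KEYS = ['placeholderKeyA', 'placeholderKeyB'] as const

// both call sites then do:
// import { SHARED_KEYS } from './constants'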


Object.entries(bl).forEach(([label, v]: any) => {
const n = +v || 0
byLabel[label] = (byLabel[label] ?? 0) + n
Member

if a value is 0, shouldn't we skip that label entry? and after this step, if the entire byLabel is empty, shouldn't we skip that datapoint?
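
A sketch of the filtering being suggested, skipping zero-valued labels and dropping the datapoint entirely when nothing is left (the function and parameter types are illustrative):

// sketch: accumulate label values but skip zeros, and drop the datapoint
// when the resulting byLabel map ends up empty.
function aggregateLabels(bl: Record<string, number | string>): Record<string, number> | null {
  const byLabel: Record<string, number> = {}
  for (const [label, v] of Object.entries(bl)) {
    const n = +v || 0
    if (n === 0) continue                               // skip zero-valued label entries
    byLabel[label] = (byLabel[label] ?? 0) + n
  }
  return Object.keys(byLabel).length ? byLabel : null   // null => skip this datapoint
}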

Member

I am in two minds about how to handle the parent protocol labels; the parent has the aggregated protocol data. Did you add logic to merge the bl field as well?
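
One way the parent-protocol labels could be handled if the decision is to merge them: sum the child protocols' bl maps label by label. This is a sketch, not what the PR currently does:

// sketch: merge the bl maps of child protocols into the parent by summing per label.
function mergeChildLabels(children: Array<Record<string, number> | undefined>): Record<string, number> {
  const merged: Record<string, number> = {}
  for (const bl of children) {
    if (!bl) continue
    for (const [label, value] of Object.entries(bl)) {
      merged[label] = (merged[label] ?? 0) + (+value || 0)
    }
  }
  return merged
}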

})

// backfill full history ONLY for protocols that newly emitted bl and have not been backfilled
const toBackfill = Array.from(protocolsWithBlDataInRecentUpdates)
Member

no, it is much simpler to reset the cache once than to do this; please remove this block of code


const recordTypeDirs = fs
.readdirSync(adapterPath)
.filter((d) => /[a-z]+protocol$/i.test(d))
Member

tbh, I really don't like this flow.

It's probably good separation of code, makes it easier to read, and lets us disable only the aggregated part, but it is more efficient to do this in one place instead of writing to a file and then reading it back. We can use a flag like generateBreakdown to enable/disable certain code blocks.

Member

can you move the logic here to addProtocolData, i.e. compute both the breakdown data & monthly/quarterly aggregation, like how we do chain aggregation there?

then, in the file-generation step (where buildLabelChartsFromRecords is run at the moment), we save this pregenerated data as a file (see the sketch below)
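
The real addProtocolData signature is not shown in this diff, so the accumulator and record shapes below are assumptions; the sketch only illustrates the shape of the change: accumulate monthly/quarterly totals and the label breakdown inline behind a flag, so the file-generation step only has to serialize the result.

type ProtocolAccumulator = {
  aggregates: { monthly: Record<string, number>; quarterly: Record<string, number> }
  breakdown: Record<string, Record<string, number>>   // month -> label -> value
}

function addProtocolDataSketch(
  acc: ProtocolAccumulator,
  record: { timestamp: number; value: number; bl?: Record<string, number> },
  options: { generateBreakdown?: boolean } = {},
) {
  if (!options.generateBreakdown) return
  const d = new Date(record.timestamp * 1000)
  const month = `${d.getUTCFullYear()}-${String(d.getUTCMonth() + 1).padStart(2, '0')}`
  const quarter = `${d.getUTCFullYear()}-Q${Math.floor(d.getUTCMonth() / 3) + 1}`
  acc.aggregates.monthly[month] = (acc.aggregates.monthly[month] ?? 0) + record.value
  acc.aggregates.quarterly[quarter] = (acc.aggregates.quarterly[quarter] ?? 0) + record.value
  for (const [label, value] of Object.entries(record.bl ?? {})) {
    const byLabel = (acc.breakdown[month] ??= {})
    byLabel[label] = (byLabel[label] ?? 0) + value
  }
}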

protocol.info = { ...(tvlProtocolInfo ?? {}), };
protocol.misc = { };
const infoKeys = ['name', 'defillamaId', 'displayName', 'module', 'category', 'logo', 'chains', 'methodologyURL', 'methodology', 'gecko_id', 'forkedFrom', 'twitter', 'audits', 'description', 'address', 'url', 'audit_links', 'cmcId', 'id', 'github', 'governanceID', 'treasury', 'parentProtocol', 'previousNames', 'hallmarks', 'defaultChartView']
protocol.misc = { versionKey: info.versionKey }; // TODO: check, this is not stored in cache correctly and as workaround we are storing it in info object
Member

I have removed versionKey; this should be changed to protocol.misc = {}

- dr = df - dssr, if dr missing and df & dssr available
- dhr = dr - dssr, if dhr missing and dr & dssr available
*/
function fillDerivedMetrics(proto: AggregatedProtocolOut) {
Member

imo, this refilling should be done when we are pulling data from the db and before it gets added to the cache. I would add a new field to the protocol cache, hasBreakdownData: if that flag is true or any of the new items have a bl field, run this logic to auto-add the missing fields (sketched below)
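
A sketch of the placement being suggested: run the fill when rows come out of the db, before the record is cached, and flag protocols that carry breakdown data. The derivation rules come from the comment quoted above (dr = df - dssr, dhr = dr - dssr), but the record shape and field names are assumptions:

type CachedRecord = {
  df?: number; dr?: number; dssr?: number; dhr?: number
  bl?: Record<string, number>
  hasBreakdownData?: boolean
}

// sketch: fill derivable metrics as rows are pulled from the db, before caching.
function fillDerivedMetricsOnPull(rec: CachedRecord): CachedRecord {
  rec.hasBreakdownData = rec.hasBreakdownData || !!rec.bl
  if (!rec.hasBreakdownData) return rec
  if (rec.dr === undefined && rec.df !== undefined && rec.dssr !== undefined) rec.dr = rec.df - rec.dssr
  if (rec.dssr === undefined && rec.df !== undefined && rec.dr !== undefined) rec.dssr = rec.df - rec.dr
  if (rec.dhr === undefined && rec.dr !== undefined && rec.dssr !== undefined) rec.dhr = rec.dr - rec.dssr
  return rec
}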
