Global economic instability is making it increasingly challenging for banks to manage risk. In a recent report, the Basel Committee on Banking Supervision—responsible for setting global regulatory standards—indicated that liquidity rules need to be adjusted in the wake of last year's collapse of Silicon Valley Bank and Credit Suisse. In this context, the role of specialists in the field is becoming increasingly important.
Dana Kaumenova is a liquidity valuation expert and Vice President at Citigroup. In her role, Kaumenova oversees risk management within the Liquid and Readily Marketable framework—a methodology developed by Citi to comply with Basel III capital requirements. Previously, she served as a Senior Analyst at Novantas, a consulting firm, where she conducted market research and competitive analytics and developed strategic recommendations based on data from over 40,000 banking customers.
In this interview, Dana discussed the challenges banks face today and shared insights about her most significant projects in liquidity management and market research.
What is your educational background?
I earned a bachelor's degree in business management and a master's in Engineering Management & Entrepreneurship from Brown University. Brown's School of Engineering is one of the best in the world, and I was drawn to the program for its interdisciplinary approach, blending finance, strategy, marketing, globalization, and management.
Plus, the program gave me the opportunity to work with the latest technologies being developed in Brown's engineering labs and to network with startup founders and venture capital firms. For example, we assisted entrepreneurs in conducting market research, identifying areas for development, and crafting business plans and marketing strategies.
What motivated you to pursue a career in consulting after completing this program?
After graduation, I sought a field where I could interact with diverse clients, solve a variety of problems, and apply my analytical skills. Consulting seemed like the perfect fit because it combines strategic thinking, analytics, and flexibility in approach.
That's how I found my way to Novantas. The company drew me in with its unique blend of consulting and fintech. Over the past six years, the marketing division—heavily reliant on market analysis—has grown tenfold. I'm also proud that the research I established continues to support the company's efforts.
What projects have you worked on at Novantas?
I worked on several significant projects in data analytics, market research, and strategy development. One of the key initiatives was Bank Choice Monitor, a tool that delivered competitive analytics to our clients on a quarterly basis. This tool was utilized by one of the largest national banks in the U.S., as well as several top regional banks. I spent about a year overseeing the entire process, from data collection to analysis and presentation of results. I also introduced a research component that included insights on market trends and consumer preferences, adding an innovative touch to the project.
When I joined the company, clients received a standardized report on competitors, which included data on products, financials, and other metrics. However, I noticed that it was missing more general insights about consumer preferences and market trends that could help clients make informed decisions. To address this gap, I began conducting consumer surveys and analyzing data to identify which key features of deposit products were important to potential users. This allowed me to provide our clients with actionable insights and recommendations.
As a result, Bank Choice Monitor (BCM) became an important tool for our clients, offering them not only competitor analytics but also an in-depth understanding of their target consumers' behavior. This data helped our clients make informed, data-driven strategic decisions, ultimately increasing their competitiveness.
What sort of insights have you shared with your clients?
For one client, I provided a comparative review of checking accounts and related fees in key markets. The analysis revealed that while our client's monthly account maintenance fees were competitive, their overdraft fees were well above those of their peers.
This finding is particularly important, as consumers in the U.S. spend more than $30 billion annually on fees for insufficient funds. We found that 16% of bank shoppers cited fees as the primary reason for switching banks, and 20% of respondents indicated that having overdraft protection was a must-have feature—up from 18% a few years earlier. Based on this insight, we advised the client to adjust their product strategy accordingly.
We also found that in the wake of the global pandemic, digital capabilities became much more important to consumers when choosing a bank, overshadowing factors like proximity to a branch or ATMs. Banks that lagged behind competitors in mobile app development faced serious difficulties in retaining existing users and attracting new ones.
What prompted your transition from consulting to liquidity assessment?
Although my main focus at Novantas was not on liquidity and banking risks, I had already acquired a foundational knowledge of these topics through my work with financial institutions. I was very interested in this area.
I deepened that understanding through my experience at Citi. The key to success in my new position was less about specific domain expertise and more about the technical, analytical, and soft skills I honed during my time in consulting. In particular, I learned to take a broader perspective on problems and place them in context.
One of the most valuable lessons I learned was the importance of asking, "so what?" This ability to continually question and seek answers has allowed me to transform complex data into practical, actionable insights.
What is your current role at Citi?
I started as an Assistant Vice President and, over the past three years, have progressed to the position of Vice President. In this position, I am responsible for risk management and data within the Liquid and Readily Marketable framework. The results of my work are used by a wide variety of teams.
In particular, I lead the implementation of projects aimed at improving operational and data efficiency. This involves communicating with various departments to analyze their needs and requirements, identifying inefficiencies in processes, and developing solutions. I translate these solutions into detailed technical specifications for developers, manage project deadlines, test new products, and train users to use them. In essence, I am responsible for the entire project cycle.
Critical thinking is essential in my work. It is necessary to see problems not just from a short-term perspective but to adopt a holistic approach that seeks long-term solutions rather than temporary fixes. I strive to think strategically and create products that can be extended and reused in the future without major rework.
In my position, it is crucial to translate complex technical and analytical concepts into understandable language for non-technical audiences and vice versa. This ability has helped to ensure smooth communication between different departments on more than one occasion.
Can you explain the Liquid and Readily Marketable framework you work with?
The LRM framework is a system developed by Citi to ensure compliance with Basel III capital requirements set by the Basel Committee on Banking Supervision. It focuses on assessing the liquidity of assets used as collateral, which is critical for managing the risks associated with financing transactions with institutional clients and investors.
The primary purpose of the LRM framework is to determine whether a bank has access to highly liquid assets that can be sold quickly in the market without incurring significant losses, particularly in the event of unforeseen market shocks. This capability helps minimize bankruptcy risks and ensures the stability of the financial system.
As part of the LRM framework, Citi assesses the liquidity and market availability of collateral. This allows for more accurate calculation and optimization of key indicators, such as internal Pre-Settlement Risk, Margin Period of Risk, and Capital Adequacy Ratios.
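To make the idea concrete, here is a minimal sketch of what an asset-level liquidity screen of this kind might look like. The field names and thresholds below are purely illustrative assumptions; Citi's actual LRM criteria are internal and not described in this interview.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    isin: str
    avg_daily_volume: float    # trailing average units traded per day
    bid_ask_spread_bps: float  # quoted spread in basis points
    days_since_last_trade: int

# Hypothetical thresholds, for illustration only.
MIN_DAILY_VOLUME = 1_000_000
MAX_SPREAD_BPS = 50
MAX_STALE_DAYS = 5

def is_liquid_and_readily_marketable(asset: Asset) -> bool:
    """Flag an asset as liquid and readily marketable if it trades
    actively, with a tight spread, and has a recent observable price."""
    return (
        asset.avg_daily_volume >= MIN_DAILY_VOLUME
        and asset.bid_ask_spread_bps <= MAX_SPREAD_BPS
        and asset.days_since_last_trade <= MAX_STALE_DAYS
    )

liquid = Asset("US912828XX00", 5_000_000, 10, 1)
stale = Asset("XS0000000001", 200_000, 120, 30)
print(is_liquid_and_readily_marketable(liquid))  # True
print(is_liquid_and_readily_marketable(stale))   # False
```

In practice, a screen like this would feed downstream metrics such as Pre-Settlement Risk and capital ratios, which is why the classification has to be both accurate and timely.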
What projects have you implemented as a VP?
One of the main accomplishments is the development and implementation of a near real-time liquidity assessment process for fixed-income and equity products.
Prior to this project, assessment results were available to traders only through a centralized dashboard with a two-day delay. Traders had to manually search for the ISIN (International Securities Identification Number), download the data, and integrate it into their analyses. Any new securities were queued up, and traders received results within 24 hours.
This delay prevented traders from accurately evaluating and pricing deals in real-time, leading to suboptimal valuations, erroneous RWA (Risk-Weighted Assets) projections, and potentially negative impacts on profitability.
I significantly transformed the process by implementing a parallel calculation tool that works in near real-time for new assets. Users can now request a liquidity assessment for securities and receive results in just 30 minutes. If an assessment from the last 23 business days already exists in the system, the results are available instantly. This enhancement provides traders with timely and accurate data, enabling them to make more informed decisions, reduce risks, and increase profitability.
My team and I also greatly improved the user experience. Users now receive calculation results via email with an Excel attachment, eliminating the need to go to the dashboard for information. Since most financial analyses are done in Excel, this innovation has proven extremely convenient, saving users a lot of time and simplifying their workflow.
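The freshness logic described above can be sketched as a simple cache lookup: serve a stored result if it falls within the 23-business-day window, otherwise enqueue a recalculation. All names and data below are hypothetical, and the business-day count ignores holidays for simplicity.

```python
from datetime import date, timedelta

FRESHNESS_WINDOW = 23  # business days, per the process described above

def business_days_between(start: date, end: date) -> int:
    """Count weekdays between two dates (holidays ignored for simplicity)."""
    days, d = 0, start
    while d < end:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday-Friday
            days += 1
    return days

# Hypothetical cache of prior assessments: ISIN -> (result, assessment date)
cache = {"US912828XX00": ("liquid", date(2024, 6, 3))}

def enqueue_calculation(isin: str) -> str:
    # Placeholder for the parallel near-real-time calculation tool.
    return f"queued:{isin}"

def get_assessment(isin: str, today: date) -> str:
    """Return a cached result instantly if fresh enough, otherwise
    enqueue a recalculation (results delivered in about 30 minutes)."""
    entry = cache.get(isin)
    if entry and business_days_between(entry[1], today) <= FRESHNESS_WINDOW:
        return entry[0]  # instant response from the cache
    return enqueue_calculation(isin)

print(get_assessment("US912828XX00", date(2024, 6, 10)))  # 'liquid'
print(get_assessment("XS0000000001", date(2024, 6, 10)))  # 'queued:XS0000000001'
```

The design choice here is the classic latency trade-off: a bounded staleness window keeps most responses instant while guaranteeing that no trader ever prices against an assessment older than the agreed limit.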
What other initiatives have you implemented?
One of my main achievements is the creation of data quality alerts within the liquidity calculation process. Previously, issues related to completeness, accuracy, and timeliness of information were identified reactively.
My team and I developed automated alerts that made it possible to identify and eliminate problems at an early stage. For example, certain municipal bonds, by their nature, should not be illiquid, and the calculation system must account for this. If such a bond receives an illiquidity flag, a notification is sent to all interested parties, allowing immediate analysis of the problem. The flag could indicate a technical processing error within or between systems, or it might stem from a specific corporate action (for instance, the issuer going through a default).
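A plausibility rule of this kind can be sketched as a simple check over assessment records: any municipal bond carrying an illiquidity flag triggers an alert for investigation. The record structure and field names below are illustrative assumptions, not the actual system's schema.

```python
# Hypothetical assessment records; field names are illustrative.
assessments = [
    {"isin": "US64966LAA11", "asset_class": "municipal_bond",
     "liquidity_flag": "illiquid"},
    {"isin": "US64966LBB29", "asset_class": "municipal_bond",
     "liquidity_flag": "liquid"},
    {"isin": "XS0000000001", "asset_class": "corporate_bond",
     "liquidity_flag": "illiquid"},
]

def data_quality_alerts(records):
    """Yield an alert for any record violating a plausibility rule,
    e.g. a municipal bond flagged as illiquid."""
    for rec in records:
        if (rec["asset_class"] == "municipal_bond"
                and rec["liquidity_flag"] == "illiquid"):
            yield (
                f"ALERT {rec['isin']}: municipal bond flagged illiquid - "
                "check for a feed error or an issuer-specific event"
            )

for alert in data_quality_alerts(assessments):
    print(alert)
```

Running such rules proactively on every calculation cycle is what turns reactive data cleanup into early detection.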
As a result of this project, data accuracy increased to 97%, and the time taken to identify and solve data quality problems was greatly reduced. Ultimately, this is essential for accurate decision-making, regulatory compliance, and reporting.
As an expert in liquidity management, what challenges do banks face in this area today?
First and foremost, market volatility. Liquidity risks are growing due to constant and unpredictable changes in the size and depth of the market. Volatility also affects the valuation of assets used as collateral—if the price of securities falls sharply, banks may be forced to allocate more capital to meet regulatory requirements.
In such an environment, it is important not only to ensure adequate capital ratios but also to effectively manage interest rate fluctuations, which can further adversely affect liquidity. Thus, liquidity risk management in the context of market volatility requires a high degree of flexibility from banks and the ability to quickly adapt to changes in financial markets.
Secondly, banks today must navigate complex regulatory frameworks across different markets and jurisdictions to properly build capital management strategies. As these requirements continue to tighten, financial organizations face an increasing need for stricter control over their assets. A prime example is the emergence of new standards such as Basel III.
Thirdly, as banks diversify their sources of funding, managing their assets and associated risks becomes more difficult.
Finally, a new challenge in recent years has been the integration of ESG (environmental, social, and governance) principles into existing risk management frameworks. Banks are now responsible not only for financial performance but also for their broader environmental, social, and governance impact. They need to allocate capital in a way that supports sustainable practices while meeting regulatory requirements.
What are you focusing on now, and how do you plan to develop further?
I am currently studying for the CAIA (Chartered Alternative Investment Analyst) certification exam. I already hold the FMVA (Financial Modeling & Valuation Analyst) certification, which provided me with a solid foundation in financial modeling, budgeting, forecasting, and general accounting and finance principles. I pursued the FMVA certification initially with the intention of entering the mergers and acquisitions field, but my career path took a different direction.
The CAIA certification is more aligned with what I am doing now. It will deepen my knowledge of alternative investments, including hedge funds, private equity, real estate, and other non-standard asset classes. This knowledge will improve my understanding of market trends and risk management in these areas, which will expand my professional opportunities in the future. In addition, I realize that it will help me make more informed investment decisions when managing my own assets.
I am also actively developing my skills in Python, which is one of the most popular tools for analyzing data, automating processes, and developing models in the financial industry. Strong programming skills allow me to analyze large amounts of data more efficiently, optimize calculations, and automate routine processes.
This capability is essential, especially as the focus on liquidity and capital issues is expected to intensify. Global economic conditions are becoming increasingly volatile, and the demands for transparency and reporting are growing. My work will only become more relevant, with a continued emphasis on process optimization, improving transparency and data quality, and reducing operational risks.