Category Archives: Programming languages


Reading Time: 5 minutes


Crypto assets are no longer on the fringe of the financial system.

The market value of these novel assets rose to nearly $3 trillion in November from $620 billion in 2017, on soaring popularity among retail and institutional investors alike, despite high volatility. This week, the combined market capitalization had retreated to about $2 trillion, still an almost four-fold increase since 2017.

Amid greater adoption, the correlation of crypto assets with traditional holdings like stocks has increased significantly, which limits their perceived risk diversification benefits and raises the risk of contagion across financial markets.

The stronger association between crypto and equities is also apparent in emerging market economies, several of which have led the way in crypto-asset adoption. The correlation between returns on the MSCI emerging markets index and Bitcoin was 0.34 in 2020–21, a 17-fold increase from the preceding years.

Stronger correlations suggest that Bitcoin has been acting as a risky asset. Its correlation with stocks has turned higher than that between stocks and other assets such as gold, investment grade bonds, and major currencies, pointing to limited risk diversification benefits in contrast to what was initially perceived.

Crypto assets have experienced tremendous growth since Bitcoin's launch in 2009, with the number of coins increasing from one to over 5,000 currently, and reaching a total market capitalization of over USD 3 trillion towards the end of 2021. However, this growth has been accompanied by significant volatility, with most crypto coins going through several cycles of rapid growth followed by dramatic collapses. This is reminiscent of other periods in financial history in which private forms of money have proliferated in the absence of adequate government regulation, leading to frequent financial crises (such as in the US during the “Free Banking Era” of 1837–1863).

The rapid ascent of crypto assets, coupled with their increasing mainstream adoption, has generated concerns among policymakers and regulators, who are mindful of the potential contagion risks to other financial markets as well as to broader macro-financial stability. Crypto-asset markets can act both as a source of shocks and as an amplifier of overall market volatility, with potentially significant implications for financial stability. Consequently, policymakers must deepen their understanding of the interconnections between crypto assets and financial markets so they can devise regulatory frameworks that effectively counteract the potential adverse consequences of crypto assets for financial stability.

The complex and rapidly evolving nature of the crypto market poses challenges for regulators in effectively assessing and addressing the associated risks. Crypto assets encompass a wide range of technological attributes and features, serving functions that range from means of payment and store of value to speculative asset, support for smart contracts, fundraising, asset transfer, decentralized finance, privacy, digital identity, and governance, among others. However, their relationship with traditional financial assets, particularly in terms of diversification potential, remains a subject of debate. While substantial research has investigated the nature, direction, and intensity of the linkages between crypto assets and other financial assets, the findings are still relatively inconclusive and paint a complex picture of interdependencies.

The multifaceted interaction channels between crypto assets and financial markets make the relationship challenging to assess, and it may also have changed over time.

On the one hand, a “flight-to-safety” channel would suggest that investors may allocate funds to crypto assets during periods of economic uncertainty or market stress if cryptos are perceived as safer and as a good hedge for certain financial assets. Crypto assets can thus provide diversification benefits if their correlation with certain classes of traditional assets is low, although their tendency toward high volatility raises important concerns. Another potential channel is the “speculative demand” channel, which would suggest that demand for crypto assets may increase during times of high financial-market risk appetite, as cryptos offer the potential for high returns due to their volatility. Further channels could relate to market liquidity and to information spillovers or investor sentiment, which can lead to additional comovement between various classes of financial assets and crypto markets.

This dataset consists of the daily closing prices of the five largest crypto assets by market capitalization as of December 31st, 2021, namely Bitcoin (BTC), Ethereum (ETH), Ripple (XRP), Binance Coin (BNB), and Tether (USDT). The stock market is captured by the US S&P500 index, and we also include the Brent oil price as well as the 10-year U.S. Treasury yield as control variables, to account for the possible impact of variations in commodity prices and financial conditions on asset prices. The S&P500 tracks the performance of 500 large companies in leading industries, represents a broad cross-section of the U.S. economy, and is widely considered representative of the overall stock market. Tether (USDT) is a stablecoin, used in this study to provide insight into the inflow and outflow of funds in the market and as a tool for hedging against the volatility of the crypto market; for this reason, USDT is likely to be more sensitive to price movements in the crypto market. A time series plot of the sampled variables accompanies the original analysis. The daily data are denominated in U.S. dollars and span the period from January 2018 to December 2021, excluding non-trading days for uniformity. Data on the cryptocurrencies (Bitcoin, Ethereum, Ripple, Binance Coin, and Tether) were retrieved from Yahoo Finance, whereas data on Brent oil and the U.S. 10-year Treasury were retrieved from the Federal Reserve Bank of St. Louis, and the S&P500 was retrieved from Investing.com market indices. The baseline specification of this study treats the S&P500 index as the dependent variable, with the cryptocurrencies and the control variables serving as explanatory variables.
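As an illustration of the correlation measure discussed throughout this analysis, here is a minimal, self-contained sketch that computes the Pearson correlation of daily returns between two price series. The prices below are invented placeholders, not the study's data:

```python
# Pearson correlation of daily returns between two price series.
# The series below are illustrative placeholders, not the study's dataset.

def daily_returns(prices):
    """Simple percentage returns from a list of closing prices."""
    return [(p1 - p0) / p0 for p0, p1 in zip(prices, prices[1:])]

def pearson(x, y):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx ** 0.5 * vy ** 0.5)

btc = [46000, 46500, 45200, 47100, 46800, 48000]  # hypothetical BTC closes
spx = [4700, 4720, 4680, 4750, 4740, 4790]        # hypothetical S&P500 closes

corr = pearson(daily_returns(btc), daily_returns(spx))
print(f"return correlation: {corr:.2f}")
```

In an actual replication the same computation would run over the four years of daily closes described above, typically as a rolling window to show how the correlation changed over time.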

The increased and sizeable co-movement and spillovers between crypto and equity markets indicate a growing interconnectedness between the two asset classes that permits the transmission of shocks that can destabilise financial markets.


This analysis suggests that crypto assets are no longer on the fringe of the financial system, IMF said.

Amid greater adoption, the correlation of crypto assets with traditional holdings like stocks has increased significantly, which limits their perceived risk diversification benefits and raises the risk of contagion across financial markets, according to new IMF research.

By- Shannul Mawlong 50401

AI Sources: ChatGPT-4

The Crucial Role of SQL in E-Business: Powering Data Management and Decision-Making.

Reading Time: 4 minutes


In the dynamic landscape of E-Business, where data is the lifeblood of operations and decision-making, Structured Query Language (SQL) stands as a cornerstone of efficient data management. SQL, a powerful domain-specific language, plays a pivotal role in handling and manipulating data within electronic business environments. This article explores the significance of SQL in E-Business and how it empowers organizations to leverage data for strategic advantage. It covers not only the pluses but also the minuses that appear in systems that rely on SQL. By the end, you will see how and why SQL matters today.

Let's begin by answering a simple question:

What is SQL?

SQL (Structured Query Language) is a domain-specific programming language designed for efficiently managing and manipulating relational databases. It provides a standardized syntax for defining database structures, querying data, ensuring data integrity, and optimizing database performance, making it a fundamental tool for effective data management in various applications.
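To make the definition concrete, here is a tiny, self-contained sketch using Python's built-in SQLite driver; the table and data are invented for illustration:

```python
import sqlite3

# Define a structure, insert data, and query it: the three core SQL tasks.
conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, country TEXT)")
conn.executemany("INSERT INTO customers (name, country) VALUES (?, ?)",
                 [("Ada", "PL"), ("Grace", "US"), ("Linus", "PL")])

# A declarative query: you describe *what* you want, not how to fetch it.
rows = conn.execute(
    "SELECT name FROM customers WHERE country = ? ORDER BY name", ("PL",)
).fetchall()
print(rows)  # [('Ada',), ('Linus',)]
```

The same `CREATE`/`INSERT`/`SELECT` statements work, with minor dialect differences, on MySQL, PostgreSQL, SQL Server, and every other relational database.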

SQL Advantages:

Database Management in E-Business:

  • E-Businesses deal with vast amounts of data daily, ranging from customer information to transaction records. SQL provides a standardized way to interact with relational database management systems (RDBMS), ensuring efficient storage, retrieval, and manipulation of data.
  • The relational model of SQL databases aligns seamlessly with the structured nature of business data, fostering better organization and retrieval.

Data Integrity and Security:

  • SQL incorporates robust features for maintaining data integrity, including constraints, primary keys, and foreign keys. In the context of E-Business, where accurate and secure data is paramount, these features ensure that data is consistent and reliable.
  • SQL also supports fine-grained access control, allowing businesses to define and enforce security policies, safeguarding sensitive information from unauthorized access.
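The constraints mentioned above are enforced by the database itself, not by application code. A sketch using SQLite (the schema is invented for illustration; note that SQLite enforces foreign keys only when the pragma is enabled):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this to enforce FKs

conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT UNIQUE NOT NULL)")
conn.execute("""
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),  -- foreign key
        total       REAL CHECK (total >= 0)                     -- sanity check
    )""")

conn.execute("INSERT INTO customers (id, email) VALUES (1, 'a@example.com')")
conn.execute("INSERT INTO orders (customer_id, total) VALUES (1, 9.99)")  # valid

rejected = False
try:
    # References customer 42, which does not exist: the database refuses it.
    conn.execute("INSERT INTO orders (customer_id, total) VALUES (42, 5.0)")
except sqlite3.IntegrityError:
    rejected = True

print("bad insert rejected:", rejected)
```

Because the rule lives in the schema, every application that touches the database gets the same guarantee for free.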

Querying and Reporting:

  • One of the primary strengths of SQL lies in its ability to execute complex queries on large datasets swiftly. E-Businesses rely on SQL queries to extract valuable insights from their data, facilitating informed decision-making.
  • Reporting tools often integrate SQL for generating customized reports, dashboards, and analytics, aiding in monitoring performance, identifying trends, and forecasting.

Transaction Management:

  • In the realm of E-Business, where numerous transactions occur concurrently, SQL’s transaction management capabilities are crucial. ACID (Atomicity, Consistency, Isolation, Durability) properties ensure that transactions are executed reliably, even in the face of system failures or errors.
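Atomicity is the property most visible in day-to-day E-Business code: a transfer either fully happens or leaves no trace. A sketch with invented account data, using the sqlite3 connection as a transaction context:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL NOT NULL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, 50.0)])
conn.commit()

def transfer(conn, src, dst, amount):
    """Move funds atomically: either both updates apply, or neither does."""
    try:
        with conn:  # the connection as context manager wraps one transaction
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                         (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                         (amount, dst))
            (bal,) = conn.execute("SELECT balance FROM accounts WHERE id = ?",
                                  (src,)).fetchone()
            if bal < 0:
                raise ValueError("insufficient funds")  # triggers rollback
    except ValueError:
        pass  # the 'with' block already rolled both updates back

transfer(conn, 1, 2, 30.0)   # commits: balances become 70.0 / 80.0
transfer(conn, 1, 2, 500.0)  # would overdraw, so it is rolled back entirely
balances = conn.execute("SELECT id, balance FROM accounts ORDER BY id").fetchall()
print(balances)  # [(1, 70.0), (2, 80.0)]
```

The failed transfer leaves no half-applied update, which is exactly the consistency guarantee ACID describes.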

Scalability and Performance Optimization:

  • As E-Businesses grow, scalability becomes a vital consideration. SQL allows for the optimization of database performance through indexing, query optimization, and other tuning techniques. This ensures that as data volumes increase, the system can handle the load efficiently.
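Indexing is the most common of these tuning techniques, and its effect is visible in the query plan. A sketch with an invented orders table, using SQLite's EXPLAIN QUERY PLAN to compare before and after:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, float(i)) for i in range(10_000)])

def plan(sql):
    """Return SQLite's query plan for a statement as one string."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 7"
before = plan(query)  # without an index: a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # with the index: a targeted index lookup
print(before)
print(after)
```

On a table of millions of rows, the difference between a scan and an index search is the difference between seconds and microseconds.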

Interoperability and Standards:

  • SQL serves as a common language understood across various database systems. This interoperability is essential for E-Businesses that may rely on diverse systems and applications. The ability to communicate seamlessly with different databases enhances integration and data flow.

Adaptability to Business Changes:

  • E-Businesses are inherently dynamic, subject to constant changes in processes and requirements. SQL’s versatility enables businesses to adapt their database structures and queries quickly, accommodating evolving business needs without compromising data integrity.

SQL Disadvantages:

Learning Curve:

  • SQL, especially for complex queries and database administration, can have a steep learning curve for beginners. It requires a good understanding of the relational model and database concepts.

Cost of Implementation and Maintenance:

  • Implementing and maintaining SQL databases, especially in large-scale E-Business environments, can involve significant costs. This includes licensing fees, hardware costs, and the need for skilled personnel to manage and optimize the database.

Security Concerns:

  • While SQL databases offer robust security features, improper configurations or lack of security best practices can lead to vulnerabilities. SQL injection attacks, if not guarded against, can compromise data integrity and confidentiality.
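The standard defense against SQL injection is parameterized queries. A self-contained demonstration (the users and the attack string are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [("alice", 0), ("bob", 1)])

malicious = "alice' OR '1'='1"

# Vulnerable: string interpolation lets the input rewrite the WHERE clause.
unsafe = conn.execute(
    f"SELECT name FROM users WHERE name = '{malicious}'").fetchall()

# Safe: a bound parameter is always treated as data, never as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)).fetchall()

print(unsafe)  # [('alice',), ('bob',)] -- every row leaks
print(safe)    # [] -- no user is literally named "alice' OR '1'='1"
```

The one-character difference between `'{...}'` and `?` is the difference between leaking the whole table and returning nothing.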

Vendor Lock-In:

  • Different relational database management systems have their own flavors of SQL, and despite standardization efforts there are still variations. This can lead to vendor lock-in, making it challenging to switch to another database system. For example, to get the current date and time in MySQL you call SELECT NOW(), whereas in Microsoft SQL Server you call SELECT GETDATE(). Differences like this make migrating an application from one SQL dialect to another costly and error-prone.


In the ever-evolving world of E-Business, where data is a strategic asset, SQL emerges as an indispensable tool for efficient data management, integrity, and decision-making. Its role in providing a standardized, powerful, and flexible interface for interacting with databases positions SQL as a key enabler of the success and sustainability of E-Businesses in the digital era. As technology continues to advance, SQL remains a foundational language, ensuring that businesses can harness the full potential of their data to drive innovation and growth. On the other hand, SQL brings significant costs. First, organizations must hire well-trained SQL specialists or give their employees opportunities to learn SQL. Second, maintaining SQL databases is a substantial, ongoing effort. Finally, poorly secured SQL systems are easy targets for external attacks.

My opinion:

I think that SQL is an amazing tool for e-commerce; I would even say that today's e-business would not look the way it does without it. SQL enables so much, for example predicting customers' future choices, that any organization that wants to stay competitive in the market should use it. Nevertheless, as managers we should keep in mind that SQL is an expensive investment, and we need to learn it to use its full potential.

I am looking forward to getting your feedback about SQL in E-Commerce.



AI Engine:

  • ChatGPT 4.0



Reading Time: 4 minutes


As the world’s leading Internet television network, with over 160 million members in over 190 countries, we serve hundreds of millions of hours of content per day, including original series, documentaries and feature films. All our all-time favourites are right at our fingertips, and that is where machine learning has taken its place on the podium. This is where we will dive into machine learning.


Machine learning impacts many exciting areas throughout our company. Historically, personalization has been the most well-known area, where machine learning powers our recommendation algorithms. We’re also using machine learning to help shape our catalogue of movies and TV shows by learning the characteristics that make content successful. Machine learning also gives us the freedom to optimize video and audio encoding, adaptive bitrate selection, and our in-house Content Delivery Network.

I believe that machine learning can open up many new possibilities in our lives, which is why we need to push forward the state of the art. This means coming up with new ideas and testing them out, be it new models and algorithms or improvements to existing ones.

Operating a large-scale recommendation system is a complex undertaking: it requires high availability and throughput, involves many services and teams, and the environment of the recommender system changes every second. In this post we will introduce RecSysOps, a set of best practices and lessons that we learned while operating large-scale recommendation systems at Netflix. These practices helped us keep our system healthy by 1) reducing our firefighting time, 2) letting us focus on innovation, and 3) building trust with our stakeholders.

RecSysOps has four key components: issue detection, issue prediction, issue diagnosis and issue resolution.

Within the four components of RecSysOps, issue detection is the most critical one because it triggers the rest of the steps. Lacking a good issue-detection setup is like driving a car with your eyes closed.


The very first step is to incorporate all the known best practices from related disciplines. Since building recommendation systems involves both software engineering and machine learning, this includes all DevOps and MLOps practices, such as unit testing, integration testing, continuous integration, checks on data volume, and checks on model metrics.
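Checks on data volume and completeness can be as simple as plain assertions run in CI. A sketch of that idea; the thresholds, field names, and sample feed are invented for illustration:

```python
# Two toy input checks of the kind a recommendation pipeline might run in CI.

def check_data_volume(row_count, expected, tolerance=0.2):
    """Flag a feed whose daily row count drifts more than `tolerance` from expected."""
    drift = abs(row_count - expected) / expected
    return drift <= tolerance

def check_no_nulls(records, required_fields):
    """Every record must carry all required fields with non-None values."""
    return all(r.get(f) is not None for r in records for f in required_fields)

feed = [{"user_id": 1, "score": 0.7}, {"user_id": 2, "score": 0.4}]
assert check_data_volume(len(feed), expected=2)
assert check_no_nulls(feed, ["user_id", "score"])
print("input checks passed")
```

Real pipelines attach checks like these to every upstream feed, so a silently shrinking or null-filled input fails loudly before the model ever retrains on it.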

The second step is to monitor the system end-to-end from your perspective. In a large-scale recommendation system there are many teams that often are involved and from the perspective of an ML team we have both upstream teams (who provide data) and downstream teams (who consume the model).

The third step for getting comprehensive coverage is to understand your stakeholders’ concerns; this is the best way to increase the coverage of the issue-detection component. In the context of our recommender systems, there are two major perspectives: our members and our items.

Detecting production issues quickly is great, but it is even better if we can predict those issues and fix them before they reach production. For example, proper cold-starting of an item (e.g. a new movie, show, or game) is important at Netflix because each item launches only once, much like a Zara product that is replaced by a new launch once demand fades.

Once an issue is identified by either the detection or the prediction models, the next phase is to find the root cause. The first step in this process is to reproduce the issue in isolation. The next step is to figure out whether the issue is related to the inputs of the ML model or to the model itself.

Once the root cause of an issue is identified, the next step is to fix it. This part is similar to typical software engineering: we can apply a short-term hotfix or a long-term solution. Beyond fixing the issue, another phase of issue resolution is improving RecSysOps itself. Finally, it is important to make RecSysOps as frictionless as possible; this makes operations smooth and the system more reliable.


To conclude, in this blog post I introduced RecSysOps, a set of best practices and lessons that we’ve learned at Netflix. I think these patterns are useful for anyone operating a real-world recommendation system who wants to keep it performing well and improve it over time. Overall, putting these aspects together has helped us significantly reduce issues, increase trust with our stakeholders, and focus on innovation.





When life gives you data, make an analysis!

Reading Time: 2 minutes

… or a day in the life of a data analyst 

Data analysts, who are entrusted with converting raw data into insights that can assist decision-making, are the unsung heroes of the data industry. Their profession requires gathering, cleaning, and analyzing large amounts of data in order to find trends, patterns, and linkages that could otherwise go unnoticed. In a time when data is king, data analysts are essential in helping businesses make sense of the massive volumes of data they produce every day.

A data analyst’s day can be varied and difficult, involving everything from gathering and cleaning data to examining and visualizing it to developing and testing predictive models. In addition to having a solid grasp of statistics, data visualization, and machine learning, data analysts must be able to concisely and clearly convey their findings to stakeholders. In order to comprehend the business context of the data and guarantee that their research meets the objectives of the stakeholders, they must also be able to work collaboratively with cross-functional teams that include engineers, product managers, and business analysts.

One of the most important tools in the data analyst’s arsenal is the programming language Python, which has become the de facto language for data analysis and data science. Python offers a wealth of libraries and tools that make it easy to perform data analysis tasks, such as collecting data, cleaning data, exploring data, and building predictive models.


Here are some of the most common Python libraries used for data analysis:

  • Pandas: A fast, flexible, and powerful data analysis and manipulation library, used for tasks such as data cleaning, aggregation, and transformation.
  • NumPy: A library for numerical computing in Python, used for tasks such as linear algebra, random number generation, and array operations.
  • Matplotlib: A 2D plotting library for Python, used for tasks such as data visualization, histograms, and scatter plots.
  • Seaborn: A data visualization library based on Matplotlib, used for tasks such as regression plots, heatmaps, and violin plots.
  • Scikit-learn: An open-source library for machine learning in Python, providing a wide range of algorithms for classification, regression, clustering, and dimensionality reduction.
  • TensorFlow: A popular open-source platform for developing and training ML models, used for a wide range of tasks including image recognition, natural language processing, and time series forecasting.
  • PyTorch: An open-source ML framework for building and training deep learning models, used for tasks such as image classification, sequence analysis, and recommendation systems.

To conclude, data analysts are essential to helping businesses understand the enormous amounts of data they produce every day. To transform data into insights and encourage reasoned decision-making, they combine technical abilities, like Python programming and machine learning, with soft skills, like cooperation and communication. The world of data is an exciting and gratifying place to be, and there are endless opportunities for growth and development whether you are an experienced data analyst or just getting started.


ChatGPT’s new competitor

Reading Time: 3 minutes


Bing is an updated Microsoft search service based on artificial intelligence. It is built on the OpenAI GPT language model, but on a newer version than ChatGPT’s GPT-3.5. Microsoft says it is not just an updated search engine but a new AI-based search experience with a chat interface that offers better searches, more complete answers, and more relevant results, so users can spend less time digging through pages. Artificial intelligence will revolutionize every category of software, including the largest category: search. Bing can also create content and inspire creativity. According to Microsoft, the new Bing can generate useful content: create a 5-day itinerary for your dream vacation to Hawaii with links to book travel and accommodation, write emails, prepare for interviews, and create quiz questions. The new Bing also cites all its sources, so you can see links to the web content it references.

Microsoft has also announced changes to Edge. Artificial intelligence has been added to Edge to help people do more with search and the internet. As for the new Bing search and the new Edge browser, Microsoft highlights some key features:

  • Better search. The new Bing offers an improved version of familiar search, providing more relevant results for simple things like sports scores, promotions and weather, as well as more complete answers when you need them. It also provides a new sidebar for displaying answers.
  • Complete answers. Bing searches the web for results to find and summarize the answer you are looking for. For example, you can get step-by-step instructions on how to replace eggs with another ingredient in the cake you are baking, without scrolling through multiple results.
  • New chat. For more complex searches, such as planning a detailed travel itinerary or choosing a TV to buy, the new Bing offers a new interactive chat. Chat lets you narrow down your search until you get the full answer you are looking for, asking for details, clarifications and ideas. Links are provided, so decisions can be made immediately.
  • New Microsoft Edge interface. We have updated the Edge browser with new artificial intelligence features, a new look, and two new functions: chat and compose. Use the Edge sidebar to request summaries of long financial reports and get the main conclusions, use the chat function to request comparisons with competitors’ financial reports, and automatically place them in a spreadsheet. You can also ask Edge to help you create content, such as posts for LinkedIn, and then get help adjusting the tone, format and length of your message. Edge understands the web page you are viewing and adapts accordingly.

However, the announcement put Google on alert internally, and even the tech giant's founders and shareholders, Larry Page and Sergey Brin, stepped in. On Monday, the company introduced its own alternative to ChatGPT called Bard. Google CEO Sundar Pichai called the software an “experimental artificial intelligence service” that is still being tested by a limited number of users and company employees and will be released to the general public in the coming weeks.


Thus, Bing has been developed to make research easier and its results more reliable. Starting with the chat mode, you can ask literally any question using an interface very similar to ChatGPT’s, and the answer arrives in seconds.

Interestingly, when searching for information in real time on the Internet, responses draw directly from various topical sites. The source used to construct the answer is shown as a footnote, but the user is redirected to the main page of the site in question rather than to the page containing the text.



DeepMind’s AlphaCode Performs Satisfactorily in a Programming Competition

Reading Time: 3 minutes
Source: Maciek905/Dreamstime stock image

AI code generation systems are a type of artificial intelligence technology that is capable of automatically generating code. These systems have the potential to revolutionize the way software is developed, making it faster and more efficient.

One of the main benefits of AI code generation systems is their ability to save time. These systems can analyze a given problem and automatically generate a solution in the form of code. This can significantly reduce the amount of time it takes for developers to write code from scratch. Additionally, these systems can often generate code that is more efficient and optimized than code written by humans, which can lead to faster and more reliable software.

Another benefit of AI code generation systems is their ability to improve the accuracy and reliability of code. By analyzing a problem and generating a solution, these systems can help eliminate human error that can lead to bugs and other issues in software. This can help reduce the time and resources needed for debugging and testing, which can save money and improve the overall quality of the software.

One of the main challenges of AI code generation systems is their reliance on data. These systems need large amounts of data to learn and generate code, which can be a problem if the data is not available or is of poor quality. Additionally, these systems are only as good as the algorithms and models they are based on, and it can be difficult to design and train these models to generate high-quality code.

Despite these challenges, there has been significant progress in the development of AI code generation systems in recent years. One example is the development of “neural machine translation” systems, which are capable of automatically translating text from one language to another. These systems have been able to achieve impressive levels of accuracy, and they have been widely adopted in a variety of industries.

Another example is the development of “auto-coding” systems, which are capable of generating code for a variety of programming languages. These systems have the potential to significantly reduce the time and effort required to develop software, and they are being explored by a number of companies and organizations.

Examining the abilities of AI code generation systems can be tricky. One way of doing so is to enter the system in a programming competition against human programmers. A recent experiment of that kind was performed by DeepMind, a subsidiary of Alphabet Inc. and a trailblazing artificial intelligence research laboratory. The experiment used its AlphaCode deep learning system. AlphaCode converts user input into functioning code by first rewriting it as an action plan, then transforming that plan into concrete steps, and finally turning those steps into fully working code. AlphaCode achieved an “average” rating in the competition, a promising result for AI code generation systems.

Overall, AI code generation systems have the potential to revolutionize the way software is developed. These systems can save time and improve the accuracy and reliability of code, and they have already made significant progress in a number of areas. However, there are still challenges to be addressed in terms of data availability and model design, and it will be interesting to see how these systems continue to evolve and improve in the coming years.


DeepMind. “Competitive Programming with AlphaCode.” DeepMind. Published December 8, 2022.

Li, Yujia, et al. “Competition-Level Code Generation with AlphaCode.” Science. Published December 8, 2022.

Kolter, J. Zico. “AlphaCode and ‘Data-Driven’ Programming.” Science. Published December 8, 2022.

DeepMind. “AlphaCode Attention Visualization.” DeepMind. Accessed January 9, 2023.