Welcome to a curated list of handpicked free online resources related to IT, cloud, Big Data, programming languages, and DevOps: fresh news and a community-maintained list of links, updated daily. Like what you see? [ Join our newsletter ]

Underrated Scala features and hidden gems in the standard library

Categories

Tags akka java kotlin app-development scala

A data structure that manages resources automatically. It lets us focus on the task at hand by giving us a handle on the acquired resource, which is then automatically released at the end so that we avoid resource leaks. By Anzori (Nika) Ghurtchumelia.

The article then describes some useful utilities in Scala:

  • PartialFunction[-A, +B]
  • scala.util.chaining
  • Function types
  • Implicits & term inference
  • Local functions
  • lazy vals & by-name parameters
  • Type aliases

You will get examples for each of the mentioned utilities. Interesting read!

[Read More]

Kotlin or Java for Android app development: which one should you choose?

Categories

Tags android java kotlin app-development web-development

Android app development is a big and ever-growing industry. Kotlin or Java for Android app development: let's understand more about these technologies. By cumulations.com.

If you are confused about making the right choice between the two, read this article to decide which would be the right one to go for. In the article you will get information on:

  • Background of Kotlin or Java for Android app development
  • Community support
  • Extensions
  • Performance
  • Syntax
  • Scalability
  • Learning curve
  • Smart casts
  • Which and when to use?

In the end, the choice of one over the other will depend on your personal preference, convenience, and the project's needs. Both have their advantages and drawbacks, so be careful in making the selection. Don't be in a hurry, or you may end up making a mess of things. Nice one!
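To make the comparison concrete, here is a minimal, hypothetical Kotlin sketch of two of the features weighed above, extension functions and smart casts; the names and values are made up for illustration.

    // Extension function: adds behaviour to String without subclassing or utility classes.
    fun String.initials(): String =
        split(" ").mapNotNull { it.firstOrNull()?.uppercaseChar() }.joinToString("")

    // Smart cast: after the type check, `value` can be used as that type with no explicit cast.
    fun describe(value: Any): String = when (value) {
        is String -> "String of length ${value.length}"   // value smart-cast to String
        is Int    -> "Int twice as big is ${value * 2}"    // value smart-cast to Int
        else      -> "Something else"
    }

    fun main() {
        println("jane marie doe".initials())  // JMD
        println(describe("Kotlin"))           // String of length 6
        println(describe(21))                 // Int twice as big is 42
    }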

[Read More]

Mastering Gradle dependency management with version catalogs: A comprehensive guide

Categories

Tags android learning kotlin app-development code-refactoring

In complex and modular Android projects, managing dependencies can be a daunting and time-consuming task. Gradle Version Catalogs, introduced in Gradle 7.0 and promoted to stable in version 7.4, offer an elegant solution to streamline dependency management. In this comprehensive guide, we will explore the benefits of Version Catalogs, demonstrate how to implement them in an Android project with code snippets and provide tips to help you get the most out of this powerful feature. By Kashif Mehmood.

The article then explains why to use Version Catalogs:

  • Centralized and shareable configuration: by consolidating dependency coordinates in a single file, Version Catalogs simplify management and promote consistent configuration across multiple projects.
  • Improved performance: compared to buildSrc, Version Catalogs offer better performance, as updating dependency versions no longer requires a complete rebuild.
  • Flexibility: Version Catalogs support the creation of dependency bundles, enabling developers to add a single implementation line for a set of libraries in their Gradle files.
  • Version Catalogs support third-party plugins for automatic version updates and offer better performance compared to buildSrc solutions.
  • Type-safe dependencies: Version Catalogs encourage type-safe dependency declarations, reducing typos and improving IDE support for content assistance.

It also provides an example of how to implement version catalogs in an Android project, with code snippets and configuration. Good read!
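As a rough illustration of the idea (these are not the article's own snippets), the sketch below shows a hypothetical catalog entry from gradle/libs.versions.toml, reproduced as comments, together with the type-safe accessors it generates in a build.gradle.kts file; the library aliases and versions are made up.

    // gradle/libs.versions.toml (a TOML file, shown here as comments):
    //   [versions]
    //   retrofit = "2.9.0"
    //   [libraries]
    //   retrofit-core  = { module = "com.squareup.retrofit2:retrofit",        version.ref = "retrofit" }
    //   retrofit-moshi = { module = "com.squareup.retrofit2:converter-moshi", version.ref = "retrofit" }
    //   [bundles]
    //   networking = ["retrofit-core", "retrofit-moshi"]

    // app/build.gradle.kts: the catalog is exposed as the generated, type-safe `libs` accessor.
    dependencies {
        implementation(libs.retrofit.core)        // one library from the catalog
        implementation(libs.bundles.networking)   // a whole bundle in a single line
    }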

[Read More]

Eying efficiency: This is data's opportunity

Categories

Tags data-science cio how-to big-data machine-learning

Data is the biggest value driver for businesses, bringing positive change and enabling near-term efficiency. Keeping an eye on the future is vitally important to businesses, and it's why data leadership is of rising value. Although there is currently a great deal of economic uncertainty, the data community should feel optimistic; this is the opportunity to demonstrate business value, benefit colleagues and play a role in efficiency and sustainability initiatives. By Danielle McConville.

Data leads to answers to the difficult questions business leaders are challenged with. Indeed, business advisory firm McKinsey finds that data-driven businesses achieve sales growth of between 15 and 25 percent in business-to-business lines.

A few of the points mentioned in the article:

  • Why have a data strategy
  • A better understanding

With the right type of data strategy in place, organisations can leverage a data-driven approach wherein they have a true understanding of where they are currently as a business, where the business needs to go next, and what strategic assets it has to get there. In addition, data can reveal industry trends or disruptions caused by technology or market conditions.

The data opportunity is more than just a short-term efficiency gain. Leading businesses are using data to gain a deeper and more informed understanding of every element of the organisation. McKinsey reports: “data-driven culture fosters continuous performance improvement to create truly differentiated customer and employee experiences.” This is leading to finance optimisation, lean inventory management, customer journey insights and buyer behaviour studies. Good read!

[Read More]

Real-time data linkage via Linked Data Event Streams

Categories

Tags data-science streaming performance how-to big-data apache

Interchanging data across domains and applications in real time is challenging: data format incompatibility, latency and outdated data sets, quality issues, and a lack of metadata and context. A Linked Data Event Stream (LDES) is a new data publishing approach which allows you to publish any dataset as a collection of immutable objects. The focus of an LDES is to allow clients to replicate the history of a dataset and efficiently synchronize with its latest changes. By towardsai.net.

Using Linked Data Event Stream (LDES), data can be fluently shared between different systems and organizations. In this way, companies and organizations can ensure that their data is well-structured, interoperable, and easily consumable by other systems and services. LDES has emerged as a standard for representing and sharing up-to-date data streams.

Further in the article:

  • Linked Data Event Streams explained in 8 minutes
  • Onboarding of a Linked Data Event Stream
  • Interlink multiple Linked Data Event Streams
  • SPARQL query
  • Combining multiple data streams via their semantics

To replicate the whole proof of concept (LDES 2 GraphDB), please go to the provided GitHub repository. It describes how to set up GraphDB and Apache NiFi via Docker, after which the data flow can be started using the supplied Apache NiFi setup file. Nice one!
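As a small, hypothetical illustration of querying replicated LDES data (not taken from the article's repository), the sketch below sends a SPARQL query to a local GraphDB repository over the standard SPARQL HTTP protocol; the endpoint, repository name, and the exact vocabulary terms are assumptions.

    import java.net.URI
    import java.net.http.HttpClient
    import java.net.http.HttpRequest
    import java.net.http.HttpResponse

    fun main() {
        // Hypothetical GraphDB repository filled by an LDES client; adjust host and repository id.
        val endpoint = URI.create("http://localhost:7200/repositories/ldes-demo")

        // Count the members replicated from the event stream so far (prefixes shown for illustration).
        val query = """
            PREFIX ldes: <https://w3id.org/ldes#>
            PREFIX tree: <https://w3id.org/tree#>
            SELECT (COUNT(?member) AS ?members)
            WHERE { ?stream a ldes:EventStream ; tree:member ?member . }
        """.trimIndent()

        val request = HttpRequest.newBuilder(endpoint)
            .header("Content-Type", "application/sparql-query")
            .header("Accept", "application/sparql-results+json")
            .POST(HttpRequest.BodyPublishers.ofString(query))
            .build()

        val response = HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
        println(response.body())   // JSON result bindings with the member count
    }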

[Read More]

Comparisons of proxies for MySQL

Categories

Tags mysql database performance how-to devops

HAProxy, ProxySQL, MySQL Router (AKA MySQL Proxy): in the last few years, I have had to answer many times the question of which proxy to use and in what scenario. When designing an architecture, many components need to be considered before deciding on the best solution. By Marco Tusa.

When deciding what to pick, there are many things to consider, like where the proxy needs to sit, whether it “just” needs to redirect connections or should also provide features such as caching and filtering, and whether it needs to be integrated with some MySQL-embedded automation. The tests in the article are driven with sysbench, for example:

sysbench ./src/lua/windmills/oltp_read.lua --db-driver=mysql --tables=200 --table_size=1000000 \
  --rand-type=zipfian --rand-zipfian-exp=0 --skip_trx=true --report-interval=1 --mysql-ignore-errors=all \
  --mysql_storage_engine=innodb --auto_inc=off --histogram --stats_format=csv --db-ps-mode=disable --point-selects=50 \
  --reconnect=10 --range-selects=true --rate=100 --threads=<#Threads from 2 to 4096> --time=1200 run

The article then describes:

  • The environment
  • The tests
  • Conclusions

HAProxy comes out as the champion in this test; there is no doubt that it can scale with the increasing connection load without being significantly affected by the load generated by the requests. Its lower resource consumption also indicates possible room for even more scaling. For further details, charts, and test results, follow the link to the original article. Interesting!

[Read More]

Developer's journey to AWS Lambda

Categories

Tags serverless cloud aws devops microservices learning

AWS Lambda has a surprising learning curve. You create a new function, write your code, and it executes. Easy, right? Then you discover just how deep the rabbit hole goes. Deluged by so many topics, you may find it hard to know where to go next. By Stephen Sennett.

In this post, the author breaks down five steps to get you moving along your Lambda journey:

  • Improving your code
  • Leveraging CI/CD and frameworks
  • Easy observability with PowerTools
  • Tweaking your deployment
    • Function scaling with reserved concurrency
    • Cold starts and provisioned concurrency
    • Managing changes with versions and aliases
    • Tuning your resource configuration
  • Architecting for serverless

The learning with Lambda does not stop here; there are still many areas of the service and its development to explore further! You can find answers to almost any problem between the Developer Guide and the Operator Guide. AWS has made it genuinely possible to become an expert in its services by navigating the documentation, and Lambda is no exception. The article also includes links to some excellent videos from AWS re:Invent 2022. Good read!
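As a rough sketch of two of the deployment tweaks listed above, reserved concurrency and versions with aliases, here is roughly what they could look like from Kotlin using the AWS SDK for Java v2; the function name orders-handler and the numbers are made up, and in practice you would more likely set these through your IaC or CI/CD tooling than with ad hoc calls.

    import software.amazon.awssdk.services.lambda.LambdaClient

    fun main() {
        LambdaClient.create().use { lambda ->
            // Reserved concurrency: cap how many concurrent executions this function
            // may claim from the account-wide pool (and guarantee it that capacity).
            lambda.putFunctionConcurrency { req ->
                req.functionName("orders-handler").reservedConcurrentExecutions(10)
            }

            // Versions and aliases: freeze the current code/config as an immutable version...
            val published = lambda.publishVersion { req -> req.functionName("orders-handler") }

            // ...and point a stable alias at it, so callers never hard-code version numbers.
            lambda.createAlias { req ->
                req.functionName("orders-handler")
                    .name("live")
                    .functionVersion(published.version())
            }
        }
    }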

[Read More]

NGINX tutorial: How to use OpenTelemetry tracing to understand your microservices

Categories

Tags cloud nginx monitoring devops microservices servers

A microservices architecture comes with many benefits, including increased team autonomy and increased flexibility in scaling and deployment. On the downside, the more services in a system (and a microservices app can have dozens or even hundreds), the more difficult it becomes to maintain a clear picture of the overall operation of the system. Observability tooling gives us the power to build that picture across numerous services and supporting infrastructure. By Vijay Kanade.

Telemetry – The act of gathering metrics, traces, and logs and transferring them from their point of origin to another system for storage and analysis. Also, the data itself.

In this tutorial, we highlight one very important type of observability for microservices apps: tracing. The article further covers:

  • Tutorial architecture and telemetry goals
  • Architecture and user flow
  • Set up basic OpenTelemetry instrumentation
  • Set up OpenTelemetry instrumentation and trace visualization for all services
  • Configure OpenTelemetry instrumentation of NGINX
  • Learn to read OpenTelemetry traces
  • Optimize instrumentation based on trace readings

In a production environment, you might want to add things like custom spans for each database query and additional metadata on all spans describing runtime details, such as each service's container ID. You could also implement the other two types of OTel data (metrics and logs) to give you a complete picture of the health of your system. Nice one!
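For example, a custom span around a database query, with extra metadata attached, might look roughly like this in application code built on the OpenTelemetry API (shown here from Kotlin; the tracer name, span name, attribute values and the queryDatabase helper are all hypothetical):

    import io.opentelemetry.api.GlobalOpenTelemetry

    // Hypothetical data-access call standing in for the real query.
    fun queryDatabase(orderId: String): String = "order-$orderId"

    fun lookupOrder(orderId: String): String {
        val tracer = GlobalOpenTelemetry.getTracer("orders-service")
        val span = tracer.spanBuilder("db.query.orders").startSpan()
        try {
            return span.makeCurrent().use {
                // Extra metadata: the statement being run and the container handling it.
                span.setAttribute("db.statement", "SELECT * FROM orders WHERE id = ?")
                span.setAttribute("container.id", System.getenv("HOSTNAME") ?: "unknown")
                queryDatabase(orderId)
            }
        } finally {
            span.end() // always end the span, even if the query throws
        }
    }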

[Read More]

What is virtual memory? Meaning, architecture, benefits and challenges

Categories

Tags cloud cio linux miscellaneous servers

Virtual memory speeds up the execution of heavier applications without the system running out of memory. Virtual memory is defined as a memory management method whereby computers use secondary memory to compensate for the scarcity of physical memory. By Vijay Kanade.

This insightful article captures:

  • What is virtual memory?
  • Types of virtual memory
  • Virtual memory architecture
  • Benefits and challenges of virtual memory

Virtual memory is ubiquitous in today's operating systems, such as Windows 10, Windows XP, Windows 7, Android, and Linux. Users can combine physical and virtual memory to improve effective RAM performance. It allows the OS to run multiple programs simultaneously, such as web browsing, writing a research paper, and executing software code, without exhausting memory. Interesting read!
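As a tiny, Linux-only illustration of the split between physical memory and the swap space that backs virtual memory, the sketch below (a hypothetical helper, not from the article) reads the kernel's /proc/meminfo file:

    import java.io.File

    fun main() {
        // Parse the "Key:   value kB" lines exposed by the Linux kernel.
        val meminfo = File("/proc/meminfo").readLines()
            .mapNotNull { line ->
                val parts = line.split(":", limit = 2)
                if (parts.size == 2) parts[0].trim() to parts[1].trim() else null
            }
            .toMap()

        println("Physical RAM (MemTotal) : ${meminfo["MemTotal"]}")
        println("Swap total              : ${meminfo["SwapTotal"]}")
        println("Swap free               : ${meminfo["SwapFree"]}")
    }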

[Read More]

Humanness in the age of AI

Categories

Tags crypto big-data cloud cio data-science miscellaneous

A path to an open and permissionless identity protocol. The Worldcoin project is initiating an open and permissionless identity protocol called World ID. It empowers individuals to verify their humanness online while maintaining their anonymity through zero-knowledge proofs. By @worldcoin.org.

Advancements in AI make it difficult to distinguish between AI and humans on the internet, highlighting a need for authentic human recognition and verification. To help address this, individuals can receive a future-proof unique human credential through a secure biometric device in a privacy-preserving manner. Importantly, this enables a path to AI-funded non-state UBI and the equitable global distribution of digital currencies.

The article covers:

  • The need for proof of personhood
  • A world with proof of personhood
  • Proof of personhood requirements
  • Potential proof of personhood mechanisms
  • Making proof of personhood a reality

The performance of deep learning models is improving at an accelerating pace, reaching superhuman levels in benchmarks with ever-increasing speed. Language models have recently experienced significant advancements, attaining performance within the upper 20th percentile of human capabilities on the majority of conventional assessments. When comparing GPT-3.5 to GPT-4, separated by only a few months, the rate of progress becomes especially clear.

Access to increasingly powerful models is becoming available to individuals in ways that are impossible to control. The Stable Diffusion image generation model and software to generate deep fakes are open source, and Meta’s LLaMA language model has been leaked and can be run on a laptop. Very interesting read!

[Read More]