Welcome to a curated list of handpicked free online resources related to IT, cloud, Big Data, programming languages, and DevOps. Fresh news and a community-maintained list of links, updated daily. Like what you see? [ Join our newsletter ]

Rapid progress in AI research and development faces hurdles

Tags ai analytics big-data cio

Major Chinese tech companies last week announced extensive price cuts for their large language model (LLM) products used for generative artificial intelligence, a move that experts say will speed up the application of AI models in the domestic market and in research. While price wars are not uncommon in the world of LLMs, this one comes as Chinese developers have made steady progress in commercialising AI technology in recent months. By Amber Wang.

Lin Yonghua, deputy director and chief engineer of the Beijing Academy of Artificial Intelligence (BAAI), a leading non-profit in AI research and development, said the “bottleneck” around training data is looming large.

The article then explains:

  • Need for localised models
  • Filling the data gap
  • The GPU bottleneck

Industry discussions also focus on the strong demand for computing power: a lack of it inhibits the implementation of LLMs and delays product deployment, they say. Good read!

[Read More]

Qualcomm: Generative AI and impressive diversification signal a new era of growth

Tags infosec management ai cio machine-learning

QCOM’s entry into the AI chips market seems promising, with MSFT endorsing the ARM-based Snapdragon X Elite CPU chips as the “fastest, most AI-ready PC ever built.” By Juxtaposed Ideas.

MSFT has recently launched new AI-powered personal computers with QCOM, the Copilot+ PCs, powered by the latter’s latest ARM-based Snapdragon X Elite CPU chips. With OpenAI’s multimodal GPT-4o model embedded in these PCs, which supposedly boast faster processing speeds and more efficient batteries than AAPL’s most advanced MacBook Air with the M3 processor, it is unsurprising that MSFT has touted these as the “fastest, most AI-ready PC ever built.”

For context, the Neural Processing Unit [NPU] is touted as the key to enhancing AI performance in a PC, given that it simulates the “human brain’s neural network” in processing “large amounts of data in parallel, performing trillions of operations per second.” Interesting read!

[Read More]

Shift left vs shift right: A DevOps mystery solved

Tags devops machine-learning tdd docker containers

Shift left and shift right are core testing concepts of the agile DevOps methodology, which speeds up application development by releasing small builds frequently as code evolves. As part of the continuous cycle of progressive delivery, DevOps teams are also adopting shift left and shift right principles to ensure software quality in these dynamic environments. By Saif Gunja.

The article then explains:

  • In DevOps, what is shift left? And what is shift right?
  • Why is shift left important?
  • What does shift right mean in DevOps?
  • Why shift right and shift left are critical for microservices architecture
  • Types of shift right tests
  • Types of shift left tests
  • The application security dividend of shift right and shift left

… and more. Shift right is the practice of performing testing, quality, and performance evaluation in production under real-world conditions. Shift-right methods verify that applications running in production can withstand real user load while maintaining the same high levels of quality. Nice one!
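
To make the two directions concrete, here is a small hypothetical pytest sketch (not from the article; the endpoint URL, function names, and latency threshold are made-up assumptions): the first test shifts left, exercising business logic in CI long before release; the second shifts right, probing the live service under real conditions.

```python
import requests  # pip install requests pytest


def apply_discount(price: float, pct: float) -> float:
    """Toy business logic under test."""
    return round(price * (1 - pct / 100), 2)


# Shift left: a unit test that runs in CI, long before production.
def test_apply_discount():
    assert apply_discount(100.0, 20) == 80.0


# Shift right: a smoke test probing the *live* service under real-world
# conditions. The endpoint below is a hypothetical placeholder.
def test_production_health():
    resp = requests.get("https://example.com/health", timeout=5)
    assert resp.status_code == 200
    assert resp.elapsed.total_seconds() < 1.0  # crude latency SLO check
```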

[Read More]

Big data strategies

Tags analytics big-data app-development management cio how-to

Do I have “big data”? Oddly, this is not actually a straightforward question for two reasons. By @practicaldatascience.org.

As a result, the fact that you have 16GB of RAM doesn’t mean you can easily work with a 14GB file. As a general rule, you need at least two times as much memory as your file takes up when first loaded.
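
As a rough, hypothetical sketch of that rule of thumb (not from the article; the file name is made up, and the psutil library is assumed to be available), you can compare a file’s on-disk size against free RAM before loading it. Note that a CSV’s on-disk size can understate its in-memory footprint, so treat the multiplier as a floor.

```python
import os

import psutil  # pip install psutil


def fits_in_memory(path: str, multiplier: float = 2.0) -> bool:
    """Rule of thumb: budget ~2x the file's size in available RAM."""
    file_bytes = os.path.getsize(path)
    available_bytes = psutil.virtual_memory().available
    return file_bytes * multiplier <= available_bytes


# Hypothetical usage:
# fits_in_memory("measurements.csv")  # False -> chunk, buy RAM, or distribute
```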

If you have big data, you basically have four options:

  • Use chunking to trim and thin out your data so it does fit in memory (this works if the data you were given is huge, but the final analysis dataset you want to work with is small; see the chunking sketch after this list)
  • Buy more memory. Seriously, consider it
  • Minimize the penalties of working off your hard drive (not usually practical for data science)
  • Break your job into pieces and distribute over multiple machines
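
Here is a minimal sketch of the chunking option, assuming pandas and illustrative file and column names (none of this is from the article): stream the file in pieces, trim each piece, and keep only the small result.

```python
import pandas as pd

# Read a large CSV in 1-million-row chunks, keeping only two columns
# and the rows we actually need, so the final DataFrame fits in memory.
pieces = []
for chunk in pd.read_csv("huge_file.csv", chunksize=1_000_000,
                         usecols=["user_id", "amount"]):
    pieces.append(chunk[chunk["amount"] > 0])  # trim and thin each chunk

df = pd.concat(pieces, ignore_index=True)
print(df.shape)
```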

If your program starts using more space than you have main memory, your operating system will usually just start using your hard drive for extra space without telling you (this is called “virtual memory”, and is nice in that it prevents your computer from crashing, though it will slow stuff down a lot). As a result, you won’t always get an error message if you try to load a file that is much bigger than main memory. More details on how to check whether your data fits in memory can be found in the article. Interesting read!

[Read More]

To end burnout, cybersecurity must tolerate failure

Tags infosec management web-development cio

At the Gartner Security & Risk Management Summit, speakers explained that you can’t buy your way out of cybersecurity burnout. They discussed how a shift from a protection-focused program to a response-focused security strategy can help. By Ijeoma S. Nwatu.

To kick off the 2024 Gartner Security & Risk Management Summit last week, Gartner vice president analysts Christopher Mixter and Dennis Xu opened the three-day conference with an introduction to augmented cybersecurity, with a focus on leaving the zero-tolerance-for-failure mindset behind.

“It is hard simply to survive amid the complicated operating environment you have to defend,” Mixter began. “It gets even harder as budget growth starts to level off, and as the cybersecurity talent gap expands, both here in North America and all around the world. The talent shortage is a very real threat to companies and organizations worldwide. The number of cyber talent we need might as well be a billion for all that we’re going to solve in the near term.”

“62% of cybersecurity leaders have experienced burnout at least once in the past year,” Xu shared. “You all are doing good work, but you shouldn’t have to be heroes.” Interesting read!

[Read More]

Everything Apple Intelligence will do for you (so far)

Tags ai app-development big-data data-science robotics

While the arrangement between OpenAI and Apple is attracting a lot of attention, Apple has put together a sizable number of its own large language model (LLM) tools that will run on a compatible device or in its secure cloud, Private Cloud Compute. By Jonny Evans.

To achieve this, it draws on what your device knows about you, using on-device intelligence or, where necessary, the highly secure Private Cloud Compute system in the cloud. At all times, Apple says it’s working to protect user privacy, which means your data is protected unless you choose to use a third-party AI, such as ChatGPT.

The article deals with:

  • Tools to help you write better
  • Mail is getting better
  • Meetings, now with AI assistants
  • Tools to help you stay focused
  • Making images
  • Photos gets better at helping you find your stuff
  • Wave your Image Wand
  • Siri gets serious attention

It is likely there will be additional features in place by the time Apple Intelligence is made available in the fall software updates. This is because developers can use App Intents to make features from within their apps available across the system. Meanwhile, developers get to use Xcode Complete to work smarter. Good one!

[Read More]

Big data statistics 2023: How much data is in the world?

Tags fintech app-development big-data data-science miscellaneous

You’ve probably noticed that the term “big data” is frequently used these days. That’s because big data is a major player in the digital age. This term refers to complex and massive data sets that far exceed the potential of traditional data processing applications. By Ogi Djuraskovic.

The article explains stats about:

  • General overview of big data statistics and facts
    • Only 10% of the data in the global datasphere is unique; the other 90% is replicated
    • The amount of data generated, gathered, copied, and consumed is set to reach 180 zettabytes by 2025
  • Overview of data analytics stats and facts
  • Corporate big data statistics and facts
    • Microsoft is the largest vendor in the worldwide big data and analytics software market, with a 12.8% market share
  • Cloud computing and big data center statistics
    • In 2021, the US was the country with the most data centers in the world (2,670)
    • In 2020, businesses spent $129.5 billion on cloud infrastructure services and data centers
    • Global colocation data center market revenue could increase to more than $58 billion by 2025

… and more detailed data can be found in the article. Through big data analytics, modern businesses can forecast market fluctuations, discover more lucrative business opportunities, improve their efficiency, beat their competitors, and provide a more customer-centric service and experience. Good read!

[Read More]

Squaring the circle: The high-performance computing energy paradox

Tags fintech robotics servers blockchain how-to

The latest CPUs and GPUs will consume more power than ever to support artificial intelligence (AI) and other advanced applications. How is this compatible with data centers’ energy efficiency imperatives? By Graeme Burton.

There’s a paradox at the heart of the current rush to roll out resources to support AI: on the one hand, the compute-intensive AI applications coming down the line will require radically more powerful GPUs, drawing far greater kilowatts, in order to run efficiently; on the other, data centers face ever-tighter energy efficiency imperatives. Organizations, public and private, cloud providers and data center operators are therefore turning to high performance computing (HPC).

The article then takes into consideration:

  • Power efficiency
  • In or out? The case for HPC in the cloud
  • The power of two

“Both AWS and Nvidia have collaborated to bring software and toolkits to help improve the skills of HPC users across the span of their workflows. Nvidia has developed SDKs that help users tackle the challenges of deploying HPC workloads across GPU technologies, optimizing applications to run on the Nvidia platform,” notes Hyperion Research. Interesting read!

[Read More]

Traefik 3.0: Deep dive into wasm support with Coraza WAF plugin

Tags web-development app-development javascript how-to devops

Custom plugins for Traefik are one of the most requested features going back to the early days of the project, starting with this issue, from back in 2017. Today, we will deep dive into WebAssembly support. By Emile Vauge.

The article then provides an overview of:

  • WASM overview
  • Traefik + WASM
  • Traefik + Coraza

Traefik already had an extension engine based on the Yaegi Go interpreter. It is extremely easy to set up and provides a powerful foundation for customizing Traefik. However, it requires writing plugins in Go, and some people could find this too restrictive. WebAssembly has gained huge popularity lately and provides exactly what is needed to build a perfect plugin engine: portable binaries, an open standard, speed, support for multiple languages, and multiple host environments. Traefik 3.0 adds full support for WASM plugins for middlewares and allows running binary code compiled from many different languages such as C, Rust, or JavaScript. Good read!

[Read More]

Uncovering financial crime: Going beyond rule-based approaches with a connected perspective

Tags fintech app-development management cio blockchain how-to

Money laundering, the process of disguising the origins of illicit funds, poses a significant challenge to the financial sector worldwide, as it serves as the lifeblood of organizations engaged in illicit activities such as terrorism, drug trafficking, and human trafficking. By Sean Robinson.

The article paints a picture of:

  • Understanding money laundering
  • Graph database technology: An overview
  • Leveraging AML (Anti Money Laundering) graph technology

In conclusion, AML graph database technology offers a powerful approach to detecting and investigating money laundering schemes, leveraging its ability to uncover complex relationships and patterns within vast datasets. As financial crimes continue to evolve, embracing AML graph database technology is crucial for financial institutions to stay ahead of emerging threats and safeguard the integrity of the global financial system. Interesting read!
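
The article is about graph database technology rather than code, but as a rough illustration of the connected perspective, here is a hypothetical sketch using the networkx library (account names and amounts are invented): a cycle search over a transaction graph surfaces circular money flows, a classic layering pattern, that per-transaction rules tend to miss.

```python
import networkx as nx  # pip install networkx

# A tiny transaction graph: each edge means "sent money to".
transactions = [
    ("acct_A", "acct_B", 9_500),
    ("acct_B", "acct_C", 9_400),
    ("acct_C", "acct_A", 9_300),  # closes a round-trip back to acct_A
    ("acct_D", "acct_E", 120),
]

G = nx.DiGraph()
for src, dst, amount in transactions:
    G.add_edge(src, dst, amount=amount)

# Rule-based systems evaluate transactions one at a time; a graph query
# exposes the relationships *between* them, e.g. circular flows.
for cycle in nx.simple_cycles(G):
    print("Suspicious round-trip:", " -> ".join(cycle))
```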

[Read More]