SWIG Introduction SWIG is short for Simplified Wrapper and Interface Generator, a development tool that links C and C++ programs with various other high-level languages such as Perl, Python, Ruby and Tcl. Languages currently supported: C# (Mono), C# (MS .NET), D, Go, Guile, Java, JavaScript (Node.js, V8, and WebKit), Lua, MzScheme/Racket, OCaml, Octave, Perl, PHP, Python, R, Ruby, Scilab, and Tcl/Tk. SWIG with Python With
Build an image server with zimg
Generally, large websites store their images on dedicated servers, which can be a good way to improve the performance of the website. A simpler way is to use the services provided by cloud vendors. Today we are going to introduce an open source implementation of this solution: zimg. zimg Introduction zimg is an open source image server program designed and developed in China, aiming to solve the
Python string fuzzy matching library FuzzyWuzzy
In computer science, fuzzy string matching is a technique for finding a string that matches a pattern approximately (rather than exactly). In other words, fuzzy string matching is a search that finds a match even if the user misspells a word or enters only part of a word to search. Therefore, it is also known as string approximate matching. String fuzzy search can be used in various applications, such as
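As a quick illustration, here is a small FuzzyWuzzy snippet (the sample strings and choices below are made up for demonstration):

from fuzzywuzzy import fuzz, process

# Similarity score (0-100) between two strings
print(fuzz.ratio("this is a test", "this is a test!"))          # near-exact match despite the extra character
print(fuzz.partial_ratio("this is a test", "this is a test!"))  # best matching substring

# Token-based comparison ignores word order
print(fuzz.token_sort_ratio("fuzzy wuzzy was a bear", "wuzzy fuzzy was a bear"))

# Pick the closest matches from a list of choices
choices = ["New York Jets", "New York Giants", "Dallas Cowboys"]
print(process.extract("new york", choices, limit=2))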
Compressing and decompressing files under Linux
Compressing and decompressing files under Linux is quite different from doing so on Windows. Under Windows, you only need to install a tool like WinRAR to decompress most files, while on the Linux command line each kind of file has its own compression and decompression method. Common Linux compression and decompression commands Using tar to package files The most common packaging program under Linux is tar (note that tar is packaging,
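Beyond the tar command itself, the same packing and unpacking can also be scripted from Python with the standard-library tarfile module; a minimal sketch (the archive and directory names are placeholders):

import tarfile

# Create a gzip-compressed archive of a directory
with tarfile.open("archive.tar.gz", "w:gz") as tar:
    tar.add("some_directory")           # adds the directory recursively

# Extract the archive again
with tarfile.open("archive.tar.gz", "r:gz") as tar:
    tar.extractall(path="output_dir")   # extract all members into output_dir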
HTML page parsing and extraction tools lxml and XPath
In the process of data crawling, we often need to parse HTML content. A common approach is to use regular expressions; today we mainly introduce the lxml tool and the syntax of XPath.
Introduction to lxml lxml is a high-performance Python XML library that natively supports XPath 1.0, XSLT 1.0, custom element classes, and even a Python-style data binding interface. It is built on top of two C libraries, libxml2 and libxslt, which provide most of the power for core tasks such as parsing, serialization, and transformation.
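A small illustrative example of parsing HTML with lxml and extracting data with XPath (the HTML fragment is made up):

from lxml import etree

html = """
<html><body>
  <div class="post"><a href="/a1">First</a></div>
  <div class="post"><a href="/a2">Second</a></div>
</body></html>
"""

tree = etree.HTML(html)                                # lenient HTML parser
titles = tree.xpath('//div[@class="post"]/a/text()')   # link text
links = tree.xpath('//div[@class="post"]/a/@href')     # href attributes
print(titles)   # ['First', 'Second']
print(links)    # ['/a1', '/a2']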
Building K8S Full Stack Monitoring with Elastic Technology Stack (4/4)
Elastic APM is a tool for application performance monitoring on Elastic Stack that allows us to monitor application performance in real time by collecting incoming requests, database queries, cache calls, and more. This makes it easier for us to quickly pinpoint performance issues.
Elastic APM is OpenTracing compatible, so we can use a large number of existing libraries to track application performance.
For example, we can trace a request in a distributed environment (microservice architecture) and easily find potential performance bottlenecks.
Building K8S Full Stack Monitoring with Elastic Technology Stack (3/4)
In this section we will install and configure Filebeat to collect log data from a Kubernetes cluster and send it to ElasticSearch. Filebeat is a lightweight log collection agent that can also be configured with specific modules to parse and visualize the log format of applications (e.g., databases, Nginx, etc.).
Similar to Metricbeat, Filebeat requires a configuration file to set the connection information for ElasticSearch, the connection to Kibana, and the way logs are collected and parsed.
Building K8S Full Stack Monitoring with Elastic Technology Stack (2/4)
In this article, we will use Metricbeat to monitor Kubernetes clusters, since we have already installed and configured the ElasticSearch cluster in the previous article. Metricbeat is a lightweight collector that runs on the server and is used to collect monitoring metrics for hosts and services on a regular basis. This is the first part of building our Kubernetes full-stack monitoring. Metricbeat collects system metrics by default, but also includes a
Building K8S Full Stack Monitoring with Elastic Technology Stack (1/4)
In this series of articles, we will learn how to use the Elastic technology stack to build a monitoring environment for Kubernetes. The goal of observability is to provide an operations tool for production environments to detect when a service is unavailable (e.g., if the service is down, has errors, or is slow to respond) and to keep some troubleshooting information to help us pinpoint the problem. In summary, there
Writing an interface load testing tool
Some time ago, a project was about to go live and we needed to load test its core interface; since the interface uses the gRPC protocol, we found that there are far fewer load testing tools for it than for HTTP.
Finally I found the tool ghz, which is also very full-featured.
Afterwards I wondered why there are so few load testing tools for gRPC. What are the difficulties? To verify this, I decided to write a tool myself.
Python exception retry solution
When crawling data, we often encounter program exceptions caused by network problems, and my initial approach was simply to log the error content and reprocess it afterwards. Here I have compiled some better exception retry methods and mechanisms.
Initial version:

def crawl_page(url):
    pass

def log_error(url):
    pass

url = ""
try:
    crawl_page(url)
except:
    log_error(url)

Improved version (increased number of retries):
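A minimal sketch of what such a retry loop could look like (the retry count, delay, and wrapper name below are illustrative, not the post's actual code):

import time

def crawl_page_with_retry(url, attempts=3, delay=1):
    # Try the page up to `attempts` times before logging the error.
    for _ in range(attempts):
        try:
            return crawl_page(url)
        except Exception:
            time.sleep(delay)   # brief pause before the next attempt
    log_error(url)              # all attempts failed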
Tips for using the Python network request library Requests
The Requests library is used to make standard HTTP requests in Python. It abstracts the complexity behind the request into a nice, simple API so you can focus on interacting with the service and using the data in your application. Requests POST/GET parameters: commonly used parameters are listed in the following table. Requests return object: common properties and methods of the Response class.
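A few illustrative calls (httpbin.org is used here only as a placeholder endpoint):

import requests

# GET with query-string parameters
r = requests.get("https://httpbin.org/get", params={"q": "python"}, timeout=5)
print(r.status_code)              # HTTP status code
print(r.headers["Content-Type"])  # response headers behave like a dict
print(r.json())                   # decode a JSON response body

# POST with form data
r = requests.post("https://httpbin.org/post", data={"name": "value"}, timeout=5)
print(r.text)                     # raw response body as text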
HTML parsing and extraction tool Beautiful Soup
Beautiful Soup is a Python library that can extract data from HTML or XML files. Simply put, it parses HTML tag files into a tree structure so you can easily get the corresponding attributes of specified tags. This is similar to lxml. Beautiful Soup installation Beautiful Soup 3 is no longer under development, and Beautiful Soup 4 is recommended for current projects, installed by
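A small illustrative snippet (the HTML string is made up for demonstration):

from bs4 import BeautifulSoup

html = '<html><body><p class="title">Hello</p><a href="https://example.com">link</a></body></html>'
soup = BeautifulSoup(html, "html.parser")   # parse into a tree

print(soup.p.text)                          # text of the first <p>: "Hello"
print(soup.find("p", class_="title"))       # the <p> tag with class "title"
for a in soup.find_all("a"):                # every <a> tag in the document
    print(a["href"])                        # its href attribute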
Regular Expressions and Python Re Modules
Regular expressions come up often when crawling data, and if you are not familiar with Python's re module it is easy to get confused by its many methods, so today we will review the Python re module. Before learning the module, let's see what the official documentation says. Implementation:

import re
help(re)

Helpful information:
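A few of the most commonly used re functions (the sample text and patterns are illustrative):

import re

text = "The year 2024 had 366 days"
m = re.search(r"\d+", text)           # first match anywhere in the string
print(m.group())                      # '2024'
print(re.findall(r"\d+", text))       # all matches: ['2024', '366']
print(re.sub(r"\d+", "#", text))      # replace every number with '#'

pattern = re.compile(r"^[A-Z]\w+")    # precompile when a pattern is reused
print(pattern.match(text).group())    # 'The'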
Python Geographic Data Visualization Tool GeoPandas
GeoPandas Introduction GeoPandas is an open source project that aims to make working with geospatial data in Python easier. GeoPandas extends the pandas data types to allow spatial operations on geometric types. Geometric operations are performed by shapely. GeoPandas further relies on fiona for file access and on descartes and matplotlib for plotting. GeoPandas follows the data types of pandas, so there are also two data types in GeoPandas: GeoSeries
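A minimal sketch of the GeoDataFrame workflow (the coordinates and file name are placeholders):

import geopandas as gpd
from shapely.geometry import Point

# Build a GeoDataFrame from plain data plus a geometry column
gdf = gpd.GeoDataFrame(
    {"city": ["A", "B"]},
    geometry=[Point(116.4, 39.9), Point(121.5, 31.2)],
    crs="EPSG:4326",                      # WGS84 longitude/latitude
)
print(gdf.head())
gdf.plot()                                # plot the geometries with matplotlib

# Reading spatial files mirrors pandas.read_csv
# world = gpd.read_file("countries.shp")  # path is a placeholder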
Python geographic data visualization tool mapboxgl jupyter
Introduction to Mapbox Founded in 2010 by Eric Gundersen, Mapbox has grown rapidly and become a leader in the mapping renaissance. Focused on providing custom base map tiles for map and application developers, Mapbox has positioned itself as the leading software company for web maps and mobile applications. In addition to the base map style toolset already widely used by program developers and cartographers, they also offer mapping tools written in Python and JavaScript.
Python Geographic Data Visualization Tool Basemap
Basemap Introduction Basemap is a toolkit under the Python visualization library Matplotlib. Its main function is to draw 2D maps, which are important for visualizing spatial data. Basemap itself does not do any plotting, but provides the ability to transform coordinates into one of 25 different map projections. Matplotlib can then be used to plot contours, images, vectors, lines, or points in the transformed coordinates. Basemap includes the GSHHS coastline dataset,
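A minimal plotting sketch (the Robinson projection here is just an example choice):

import matplotlib.pyplot as plt
from mpl_toolkits.basemap import Basemap

m = Basemap(projection="robin", lon_0=0)   # Robinson projection centered on longitude 0
m.drawcoastlines()                         # coastlines from the bundled shoreline data
m.drawcountries()                          # country borders
m.fillcontinents(color="lightgray", lake_color="lightblue")
m.drawmapboundary(fill_color="lightblue")
plt.show()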
Installation and use of FFM/libffm on Windows/Linux
Yu-Chin Juan, the author of FFM, has open-sourced the C++ version of the code, libffm, on GitHub. Since my daily data processing is done in a Python environment, I hoped to find a Python version of FFM. There are many related projects on GitHub, such as this one: A Python wrapper for LibFFM. Installation of libffm in a Windows + Anaconda environment Installation of the libffm-python package The project is installed on Windows as
Linux system boot speed optimization tool systemd-analyze
Introduction to systemd-analyze systemd-analyze is a tool that comes with Linux to analyze system boot performance.
Commands available for systemd-analyze:

systemd-analyze [OPTIONS…] [time]
systemd-analyze [OPTIONS…] blame
systemd-analyze [OPTIONS…] critical-chain [UNIT…]
systemd-analyze [OPTIONS…] plot [> file.svg]
systemd-analyze [OPTIONS…] dot [PATTERN…] [> file.dot]
systemd-analyze [OPTIONS…] dump
systemd-analyze [OPTIONS…] set-log-level LEVEL
systemd-analyze [OPTIONS…] set-log-target TARGET
systemd-analyze [OPTIONS…] get-log-level
systemd-analyze [OPTIONS…] get-log-target
systemd-analyze [OPTIONS…] syscall-filter [SET…]
systemd-analyze [OPTIONS…] verify [FILES…]

The meanings of the systemd-analyze commands:
Linux Scheduled Tasks with Crontab
crond is a daemon process used under Linux to periodically execute certain tasks or wait for certain events to be processed, similar to scheduled tasks under Windows. The service tool is installed by default when the operating system installation is complete, and the crond process starts automatically. crond checks every minute whether there is a task to be executed and, if there is, executes it automatically.