name : py3compile
#! /usr/bin/python3
# vim: et ts=4 sw=4

# Copyright © 2010-2012 Piotr Ożarowski <piotr@debian.org>
# Copyright © 2010 Canonical Ltd
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.

import logging
import optparse
import os
import struct
import sys
from os import environ, listdir, mkdir
from os.path import dirname, exists, isdir, join
from subprocess import PIPE, Popen
sys.path.insert(1, '/usr/share/python3/')
from debpython.version import SUPPORTED, debsorted, vrepr, \
    get_requested_versions, parse_vrange, getver
from debpython import files as dpf, PUBLIC_DIR_RE, memoize
from debpython.interpreter import Interpreter
from debpython.option import Option, compile_regexpr

# initialize script
logging.basicConfig(format='%(levelname).1s: %(module)s:%(lineno)d: '
                           '%(message)s')
log = logging.getLogger(__name__)
STDINS = {}
WORKERS = {}

"""TODO: move it to manpage
Examples:
    pycompile -p python3-mako # package's public files
    pycompile -p python3-foo /usr/share/foo # package's private files
    pycompile -V 3.1 /usr/lib/python3.1/ # python3.1 only
    pycompile -V 3.1 /usr/lib/foo/bar.py # python3.1 only
    pycompile -V 3.2- /usr/lib/python3/
"""


### EXCLUDES ###################################################
@memoize
def get_exclude_patterns_from_dir(name='/usr/share/python3/bcep/'):
    """Return patterns for files that shouldn't be bytecompiled."""
    if not isdir(name):
        return []

    result = []
    for fn in listdir(name):
        with open(join(name, fn), 'r', encoding='utf-8') as lines:
            for line in lines:
                type_, vrange, dname, pattern = line.split('|', 3)
                vrange = parse_vrange(vrange)
                versions = get_requested_versions(vrange, available=True)
                if not versions:
                    # pattern doesn't match installed Python versions
                    continue
                pattern = pattern.rstrip('\n')
                if type_ == 're':
                    pattern = compile_regexpr(None, None, pattern)
                result.append((type_, versions, dname, pattern))
    return result
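

# Standalone sketch (not part of the helper above): how one hypothetical
# bcep record is parsed by get_exclude_patterns_from_dir(). Each record has
# four '|'-separated fields: type, version range, directory, pattern; only
# the pattern may itself contain '|', hence split('|', 3).
record = "re|3.1-|/usr/lib/foo|.*_d[.]py\n"
type_, vrange_raw, dname, pattern = record.split('|', 3)
pattern = pattern.rstrip('\n')
print(type_, dname, pattern)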


def get_exclude_patterns(directory='/', patterns=None, versions=None):
    """Return patterns for files that shouldn't be compiled in given dir."""
    if versions is not None:
        # make sure it's a set (debsorted returns a list)
        versions = set(versions)
    if patterns:
        if versions is None:
            versions = set(SUPPORTED)
        patterns = [('re', versions, directory, i) for i in patterns]
    else:
        patterns = []

    for type_, vers, dname, pattern in get_exclude_patterns_from_dir():
        # skip patterns that do not match requested directory
        if not dname.startswith(directory[:len(dname)]):
            continue
        # skip patterns that do not match requested versions
        if versions and not versions & vers:
            continue
        patterns.append((type_, vers, dname, pattern))
    return patterns


def filter_files(files, e_patterns, compile_versions):
    """Generate (file, versions_to_compile) pairs."""
    for fn in files:
        valid_versions = set(compile_versions)  # all by default

        for type_, vers, dname, pattern in e_patterns:
            if type_ == 'dir' and fn.startswith(dname):
                valid_versions = valid_versions - vers
            elif type_ == 're' and pattern.match(fn):
                valid_versions = valid_versions - vers

            # move to the next file if all versions were removed
            if not valid_versions:
                break
        if valid_versions:
            public_dir = PUBLIC_DIR_RE.match(fn)
            if public_dir and len(public_dir.group(1)) != 1:
                yield fn, set([getver(public_dir.group(1))])
            else:
                yield fn, valid_versions


### COMPILE ####################################################
def py_compile(version, optimize, workers):
    if not isinstance(version, str):
        version = vrepr(version)
    cmd = "/usr/bin/python%s%s -m py_compile -" \
        % (version, ' -O' if optimize else '')
    process = Popen(cmd, bufsize=0, shell=True,
                    stdin=PIPE, close_fds=True)
    workers[version] = process  # keep the reference for .communicate()
    stdin = process.stdin
    while True:
        filename = (yield)
        stdin.write(filename.encode('utf-8') + b'\n')
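

# Standalone sketch of the primed-coroutine pattern py_compile() relies on:
# next() advances the generator to its first `(yield)`, after which send()
# delivers one item at a time. Names here are illustrative only.
def sink(collected):
    while True:
        item = (yield)
        collected.append(item)

received = []
coro = sink(received)
next(coro)          # prime the coroutine so it pauses at (yield)
coro.send('a.py')
coro.send('b.py')
print(received)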


def compile(files, versions, force, optimize, e_patterns=None):
    global STDINS, WORKERS
    # start Python interpreters that will handle byte compilation
    for version in versions:
        if version not in STDINS:
            coroutine = py_compile(version, optimize, WORKERS)
            next(coroutine)
            STDINS[version] = coroutine

    interpreter = Interpreter('python' if not optimize else 'python -O')

    # byte compile files
    skip_dirs = set()
    for fn, versions_to_compile in filter_files(files, e_patterns, versions):
        for version in versions_to_compile:
            cfn = interpreter.cache_file(fn, version)
            if version == (3, 1):
                if exists(cfn) and not force:
                    ftime = os.stat(fn).st_mtime
                    try:
                        ctime = os.stat(cfn).st_mtime
                    except OSError:
                        ctime = 0
                    if ctime > ftime:
                        continue
            else:
                pycache_dir = dirname(cfn)
                if not force:
                    try:
                        mtime = int(os.stat(fn).st_mtime)
                        expect = struct.pack('<4sl',
                                             interpreter.magic_number(version), mtime)
                        with open(cfn, 'rb') as chandle:
                            actual = chandle.read(8)
                        if expect == actual:
                            continue
                    except (IOError, OSError):
                        pass
                if pycache_dir not in skip_dirs and not exists(pycache_dir):
                    try:
                        mkdir(pycache_dir)
                    except Exception as e:
                        log.error("cannot create directory %s: %r", pycache_dir, e)
                        skip_dirs.add(pycache_dir)
                        continue
            pipe = STDINS[version]
            pipe.send(fn)


################################################################
def main():
    usage = '%prog [-V [X.Y][-][A.B]] DIR_OR_FILE [-X REGEXPR]\n' +\
        '       %prog -p PACKAGE'
    parser = optparse.OptionParser(usage, version='%prog 3.8.2-0ubuntu2',
                                   option_class=Option)
    parser.add_option('-v', '--verbose', action='store_true', dest='verbose',
                      help='turn verbose mode on')
    parser.add_option('-q', '--quiet', action='store_false', dest='verbose',
                      default=False, help='be quiet')
    parser.add_option('-f', '--force', action='store_true', dest='force',
                      default=False,
                      help='force rebuild even if timestamps are up-to-date')
    parser.add_option('-O', action='store_true', dest='optimize',
                      default=False, help="byte-compile to .pyo files")
    parser.add_option('-p', '--package',
                      help='specify Debian package name whose files should be bytecompiled')
    parser.add_option('-V', type='version_range', dest='vrange',
                      help="""force private modules to be bytecompiled
with Python version from given range, regardless of the default Python version
in the system.  If there are no other options, bytecompile all public modules
for installed Python versions that match given range.

VERSION_RANGE examples: '3.1' (version 3.1 only), '3.1-' (version 3.1 or
newer), '3.1-3.3' (version 3.1 or 3.2), '-4.0' (all supported 3.X versions)""")
    parser.add_option('-X', '--exclude', action='append',
                      dest='regexpr', type='regexpr',
                      help='exclude items that match given REGEXPR. \
You may use this option multiple times to build up a list of things to exclude.')

    (options, args) = parser.parse_args()

    if options.verbose or environ.get('PYCOMPILE_DEBUG') == '1':
        log.setLevel(logging.DEBUG)
        log.debug('argv: %s', sys.argv)
        log.debug('options: %s', options)
        log.debug('args: %s', args)
    else:
        log.setLevel(logging.WARN)

    if options.regexpr and not args:
        parser.error('--exclude option works with private directories '
                     'only, please use /usr/share/python3/bcep to specify '
                     'public modules to skip')

    if options.vrange and options.vrange[0] == options.vrange[1] and\
            options.vrange != (None, None) and\
            exists("/usr/bin/python%d.%d" % options.vrange[0]):
        # specific version requested, use it even if it's not in SUPPORTED
        versions = {options.vrange[0]}
    else:
        versions = get_requested_versions(options.vrange, available=True)
    if not versions:
        log.error('Requested versions are not installed')
        exit(3)

    if options.package and args:  # package's private directories
        # get requested Python version
        compile_versions = debsorted(versions)[:1]
        log.debug('compile versions: %s', versions)

        pkg_files = tuple(dpf.from_package(options.package))
        for item in args:
            e_patterns = get_exclude_patterns(item, options.regexpr,
                                              compile_versions)
            if not exists(item):
                log.warning('No such file or directory: %s', item)
            else:
                log.debug('byte compiling %s using Python %s',
                          item, compile_versions)
                files = dpf.filter_directory(pkg_files, item)
                compile(files, compile_versions, options.force,
                        options.optimize, e_patterns)
    elif options.package:  # package's public modules
        # no need to limit versions here, it's either pyr mode or version is
        # hardcoded in path / via -V option
        e_patterns = get_exclude_patterns()
        files = dpf.from_package(options.package)
        files = dpf.filter_public(files, versions)
        compile(files, versions,
                options.force, options.optimize, e_patterns)
    elif args:  # other directories/files
        for item in args:
            e_patterns = get_exclude_patterns(item, options.regexpr, versions)
            files = dpf.from_directory(item)
            compile(files, versions,
                    options.force, options.optimize, e_patterns)
    else:
        parser.print_usage()
        exit(1)

    # wait for all processes to finish
    rv = 0
    for process in WORKERS.values():
        process.communicate()
        if process.returncode not in (None, 0):
            rv = process.returncode
    exit(rv)

if __name__ == '__main__':
    main()