Snippets Collections
The landscape of iOS app development is continually evolving, driven by Apple's consistent innovations in hardware and software. Developers are constantly adopting new tools and paradigms to build more sophisticated, intuitive, and secure applications.

Here are some of the latest trends and technologies shaping the iOS app development process:

◦ SwiftUI's Continued Dominance: Apple's declarative UI framework, SwiftUI, is maturing rapidly. It allows developers to build user interfaces across all Apple platforms (iOS, iPadOS, macOS, watchOS, tvOS, and even visionOS) with less code and in a more intuitive way. Its integration with Xcode previews streamlines the design process (a short sketch follows this list).

◦ Artificial Intelligence (AI) and Machine Learning (ML) Integration: Core ML, Create ML, and advancements in the Neural Engine allow developers to embed powerful AI/ML capabilities directly into apps. This enables features like intelligent recommendations, advanced image recognition, natural language processing, and smarter personal assistants, often with enhanced privacy as processing occurs on-device (see the Core ML sketch after this list).

◦ Augmented Reality (AR) with ARKit: AR experiences continue to become more immersive and integrated into various app categories, from gaming and retail (virtual try-ons) to education and healthcare. ARKit's ongoing enhancements provide developers with robust tools to create compelling AR content.

◦ Enhanced Privacy and Security: Apple's strong emphasis on user privacy remains a core trend. Features like App Tracking Transparency (ATT) and Passkeys are pushing developers to build apps with privacy-by-design, focusing on transparent data handling and secure authentication.

◦ Spatial Computing and VisionOS: With the advent of Apple Vision Pro, spatial computing is becoming a significant area for developers. While still nascent, creating apps that seamlessly blend digital content with the real world or offer fully immersive experiences represents a new frontier for iOS developers.

◦ Swift Concurrency and Performance Optimization: The adoption of Swift's structured concurrency features (async/await, Actors) is improving the performance and reliability of complex iOS applications by simplifying asynchronous code execution and preventing common concurrency bugs (see the concurrency sketch after this list).

◦ Widgets, Live Activities, and App Clips: Expanding beyond the main app, developers are leveraging widgets for at-a-glance information, Live Activities for real-time updates directly on the Lock Screen or Dynamic Island, and App Clips for lightweight, on-demand app experiences without full downloads (a WidgetKit sketch follows this list).
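
To make the SwiftUI bullet concrete, here is a minimal sketch (a hypothetical counter view; the #Preview macro assumes Xcode 15 or later). Mutating @State automatically re-renders the declarative body:

import SwiftUI

// A minimal counter: changing @State triggers a re-render of body.
struct CounterView: View {
    @State private var count = 0

    var body: some View {
        VStack(spacing: 12) {
            Text("Count: \(count)")
                .font(.title)
            Button("Increment") { count += 1 }
        }
        .padding()
    }
}

// Xcode previews render this view live while you edit it.
#Preview {
    CounterView()
}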
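
For the AI/ML bullet, a sketch of on-device image classification with Vision and Core ML; the MobileNetV2 model is an assumption here (any classification model added to the Xcode project would generate a similar class):

import CoreML
import Vision

// Classify a CGImage entirely on-device; no image data leaves the phone.
func classify(_ image: CGImage) throws {
    // Assumes MobileNetV2.mlmodel was added to the project, which makes
    // Xcode generate the MobileNetV2 class used here.
    let coreMLModel = try MobileNetV2(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        print("\(top.identifier): \(top.confidence)")
    }
    try VNImageRequestHandler(cgImage: image).perform([request])
}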
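
For the Swift concurrency bullet, a small sketch (URLs and type names are illustrative) combining async/await, async let, and an actor that serializes access to shared state:

import Foundation

// An actor protects its mutable state from data races by construction.
actor DownloadStats {
    private var completed = 0
    func recordCompletion() { completed += 1 }
    var total: Int { completed }
}

func fetch(_ url: URL, stats: DownloadStats) async throws -> Data {
    let (data, _) = try await URLSession.shared.data(from: url)
    await stats.recordCompletion() // actor isolation serializes this mutation
    return data
}

// async let runs both downloads concurrently; both are awaited together.
func fetchPair(stats: DownloadStats) async throws -> (Data, Data) {
    async let a = fetch(URL(string: "https://example.com/a")!, stats: stats)
    async let b = fetch(URL(string: "https://example.com/b")!, stats: stats)
    return try await (a, b)
}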
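
And for the widgets bullet, a minimal WidgetKit sketch (names are illustrative): a static timeline with a single entry that asks WidgetKit to refresh hourly:

import WidgetKit
import SwiftUI

struct ClockEntry: TimelineEntry {
    let date: Date
}

struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry { ClockEntry(date: .now) }

    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: .now))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        // One entry now; WidgetKit is asked to fetch a new timeline in an hour.
        let entry = ClockEntry(date: .now)
        completion(Timeline(entries: [entry], policy: .after(.now.addingTimeInterval(3600))))
    }
}

struct ClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "ClockWidget", provider: ClockProvider()) { entry in
            Text(entry.date, style: .time)
        }
        .configurationDisplayName("Clock")
        .description("Shows when the widget last refreshed.")
    }
}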

These trends collectively aim to deliver more personal, intelligent, and interconnected user experiences across the Apple ecosystem.

Looking to incorporate these cutting-edge technologies into your next project? Appticz is an innovative iOS app development company for entrepreneurs, utilizing these latest trends and technologies to build responsive solutions that drive exceptional user engagement and business growth.
SELECT 
  Id, 
  PermissionsRead, 
  PermissionsEdit, 
  SobjectType, 
  Field, 
  Parent.Type, 
  Parent.Name, 
  Parent.PermissionSetGroup.DeveloperName, 
  Parent.Profile.Name 
FROM FieldPermissions
WHERE Field = 'Object__c.Field__c'
create table team_kingkong.mid_limits_shivam as (
WITH latest_limits AS (
    SELECT
        merchantid,
        identifier,
        maxamtpermonth,
        ROW_NUMBER() OVER (PARTITION BY merchantid, identifier ORDER BY modifieddate DESC) AS rn
    FROM (
        SELECT
            merchantid,
            identifier,
            maxamtpermonth,
            (DATE_DIFF('millisecond', TIMESTAMP '1970-01-01 05:30:00', CAST(modifieddate AS TIMESTAMP))) AS modifieddate
        FROM merchant_velocity.instrument_historic_data_snapshot_v3
        WHERE dl_last_updated >= DATE '2010-01-01'
        
        UNION ALL
        
        SELECT
            merchantid,
            identifier,
            maxamtpermonth,
            modifieddate
        FROM TP_S_2022_MD_EVENTLOG_001.TP_S_2022_MD_EVENTLOG_001_snapshot_v3
        WHERE dl_last_updated >= DATE '2010-01-01'
    )
),

pivoted_limits AS (
    SELECT
        merchantid,
        MAX(CASE WHEN identifier = 'UPI_CC' THEN maxamtpermonth END) AS UPI_CC_limit,
        MAX(CASE WHEN identifier = 'UPI' THEN maxamtpermonth END) AS UPI_limit,
        MAX(CASE WHEN identifier = 'CC' THEN maxamtpermonth END) AS CC_limit,
        MAX(CASE WHEN identifier = 'DC' THEN maxamtpermonth END) AS DC_limit,
        MAX(CASE WHEN identifier = 'UPI_CREDITLINE' THEN maxamtpermonth END) AS UPI_CREDITLINE_limit,
        MAX(CASE WHEN identifier = 'PER_MID' THEN maxamtpermonth END) AS overall_limit
    FROM latest_limits
    WHERE rn = 1
    GROUP BY merchantid
),

merchant_types AS (
    SELECT 
        v1.merchantid,
        CASE 
            WHEN o_mid IS NOT NULL THEN 'Online'
            WHEN e_mid IS NOT NULL THEN 'EDC' 
            ELSE 'QR' 
        END AS EDC_QR
    FROM pivoted_limits v1
    LEFT JOIN (
        SELECT DISTINCT merchant_id AS o_mid
        FROM datalake.online_payment_merchants
    ) m_3 ON v1.merchantid = m_3.o_mid
    LEFT JOIN (
        SELECT DISTINCT mid AS e_mid
        FROM paytmpgdb.entity_edc_info_snapshot_v3
        WHERE terminal_status = 'ACTIVE'
        AND dl_last_updated >= DATE '2010-01-01'
    ) m_4 ON v1.merchantid = m_4.e_mid
)
select * from (
SELECT
    p.merchantid,
    m.EDC_QR,
    CAST(p.UPI_CC_limit AS double)/100 AS UPI_CC_limit,
    CAST(p.UPI_limit AS double)/100 AS UPI_limit,
    CAST(p.CC_limit AS double)/100 AS CC_limit,
    CAST(p.DC_limit AS double)/100 AS DC_limit,
    CAST(p.UPI_CREDITLINE_limit AS double)/100 AS UPI_CREDITLINE_limit,
    CAST(p.overall_limit AS double)/100 AS overall_limit
    -- CASE 
    --     WHEN (p.overall_limit IS NULL OR p.overall_limit = -1) 
    --     THEN '1' ELSE '0' 
    -- END AS unlimited_overall_limit_flag,
    
    -- CASE 
    --     WHEN (p.overall_limit IS NULL OR p.overall_limit = -1) OR (p.UPI_CC_limit IS NULL OR p.UPI_CC_limit = -1) THEN 'NA'
    --     WHEN p.UPI_CC_limit > p.overall_limit THEN '1' 
    --     ELSE '0' 
    -- END AS UPI_CC_limit_ov_lmt_flag,

    -- CASE 
    --     WHEN (p.overall_limit IS NULL OR p.overall_limit = -1) OR (p.UPI_limit IS NULL OR p.UPI_limit = -1) THEN 'NA'
    --     WHEN p.UPI_limit > p.overall_limit THEN '1' 
    --     ELSE '0' 
    -- END AS UPI_limit_ov_lmt_flag,

    -- CASE 
    --     WHEN (p.overall_limit IS NULL OR p.overall_limit = -1) OR (p.CC_limit IS NULL OR p.CC_limit = -1) THEN 'NA'
    --     WHEN p.CC_limit > p.overall_limit THEN '1' 
    --     ELSE '0' 
    -- END AS CC_limit_ov_lmt_flag,

    -- CASE 
    --     WHEN (p.overall_limit IS NULL OR p.overall_limit = -1) OR (p.DC_limit IS NULL OR p.DC_limit = -1) THEN 'NA'
    --     WHEN p.DC_limit > p.overall_limit THEN '1' 
    --     ELSE '0' 
    -- END AS DC_limit_ov_lmt_flag,

    -- CASE 
    --     WHEN (p.overall_limit IS NULL OR p.overall_limit = -1) OR (p.UPI_CREDITLINE_limit IS NULL OR p.UPI_CREDITLINE_limit = -1) THEN 'NA'
    --     WHEN p.UPI_CREDITLINE_limit > p.overall_limit THEN '1' 
    --     ELSE '0' 
    -- END AS UPI_CREDITLINE_limit_ov_lmt_flag,

    -- CASE 
    --     WHEN (p.UPI_limit IS NULL OR p.UPI_limit = -1) OR (p.UPI_CC_limit IS NULL OR p.UPI_CC_limit = -1) THEN 'NA'
    --     WHEN p.UPI_CC_limit > p.UPI_limit THEN '1' 
    --     ELSE '0' 
    -- END AS UPI_CC_limit_UPI_limit_flag,

    -- CASE 
    --     WHEN (p.UPI_limit IS NULL OR p.UPI_limit = -1) OR (p.UPI_CREDITLINE_limit IS NULL OR p.UPI_CREDITLINE_limit = -1) THEN 'NA'
    --     WHEN p.UPI_CREDITLINE_limit > p.UPI_limit THEN '1' 
    --     ELSE '0' 
    -- END AS UPI_CREDITLINE_limit_UPI_limit_flag
FROM pivoted_limits p
JOIN merchant_types m ON p.merchantid = m.merchantid));
-- where UPI_CC_limit_ov_lmt_flag = '1'  OR UPI_limit_ov_lmt_flag = '1'
--     OR CC_limit_ov_lmt_flag = '1' OR DC_limit_ov_lmt_flag = '1'
--     OR UPI_CREDITLINE_limit_ov_lmt_flag = '1' OR UPI_CC_limit_UPI_limit_flag = '1'
function updateSelect(e) {
    const { target } = e;

    const value = target.value;

    const cadenceRow = inputCadence.closest(".form__row");
    const elevationRow = inputElevation.closest(".form__row");

    // Remove the hidden class from both rows first
    cadenceRow.classList.remove("form__row--hidden");
    elevationRow.classList.remove("form__row--hidden");

    // Map each workout type to the row that should be hidden for it
    // (running uses cadence, so elevation hides; cycling uses elevation, so cadence hides)
    const selected = {
      cycling: cadenceRow,
      running: elevationRow,
    };

    selected[value].classList.add("form__row--hidden");
  }
-- RISK 306
-- If the count of P2P transactions between a payer account and payee VPA in the previous 24 hours is greater than or equal to 10, then BLOCK

-- CREATE TABLE team_kingkong.tpap_risk306_breaches AS
INSERT INTO team_kingkong.tpap_risk306_breaches
with tpap_base as
(
SELECT DISTINCT B.*, C.category
, IF(D.upi_subtype IS NOT NULL, D.upi_subtype, IF(C.category = 'LITE_MANDATE', 'UPI_LITE_MANDATE', '')) AS upi_subtype
FROM
    (SELECT txn_id, scope_cust_id,
    MAX(CASE WHEN participant_type = 'PAYER' THEN vpa END) AS payer_vpa,
    MAX(CASE WHEN participant_type = 'PAYEE' THEN vpa END) AS payee_vpa,
    MAX(created_on) as txn_date,
    MAX(amount) AS txn_amount,
    created_on AS txn_time
    FROM switch.txn_participants_snapshot_v3
    WHERE DATE(dl_last_updated) BETWEEN DATE'2025-05-01' AND DATE'2025-05-31'
    AND DATE(created_on) BETWEEN DATE'2025-05-01' AND DATE'2025-05-31'
    AND vpa IS NOT NULL
    GROUP BY 1,2,7)B
inner join
    (select txn_id, category
    from switch.txn_info_snapshot_v3
    where DATE(dl_last_updated) BETWEEN DATE'2025-05-01' AND DATE'2025-05-31'
    and DATE(created_on) BETWEEN DATE'2025-05-01' AND DATE'2025-05-31'
    and upper(status) in ('SUCCESS')) C
on B.txn_id = C.txn_id
INNER JOIN
    (SELECT txnid
    , regexp_replace(cast(json_extract(request, '$.evaluationType') as varchar), '"', '') AS upi_subtype
    FROM tpap_hss.upi_switchv2_dwh_risk_data_snapshot_v3
    WHERE DATE(dl_last_updated) BETWEEN date'2025-05-01' AND DATE'2025-05-31'
    AND (lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerVpa') as varchar), '"', '')) LIKE '%@paytm%'
    or lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerVpa') as varchar), '"', '')) like '%@pt%')
    AND json_extract_scalar(response, '$.action_recommended') <> 'BLOCK'
    AND regexp_replace(cast(json_extract(request, '$.requestPayload.payerType') AS varchar),'"','') = 'PERSON'
    AND regexp_replace(cast(json_extract(request, '$.requestPayload.payeeType') AS varchar),'"','') = 'ENTITY')D
ON B.txn_id = D.txnid
WHERE ((payer_vpa LIKE '%@paytm%') OR (payer_vpa LIKE '%@pt%'))
AND payee_vpa LIKE '%@%' AND payee_vpa <> 'onpaytmgas@paytm'
)
 
SELECT * FROM
    (SELECT t1.payer_vpa,
      t1.payee_vpa,
      t1.txn_id,
      t1.txn_amount,
      t1.category,
      t1.upi_subtype,
      t1.txn_time,
      DATE(t1.txn_time) AS txn_date,
      COUNT(t2.txn_id) AS prior_txns_last_24h,
      15 as threshold
    FROM tpap_base t1
    INNER JOIN tpap_base t2
      ON t1.payer_vpa = t2.payer_vpa
      AND t1.payee_vpa = t2.payee_vpa
      AND t2.txn_time BETWEEN (t1.txn_time - INTERVAL '86400' SECOND) AND t1.txn_time
      AND t1.txn_id <> t2.txn_id
    GROUP BY t1.payer_vpa, t1.payee_vpa, t1.txn_id, t1.txn_amount, t1.category, t1.upi_subtype, t1.txn_time, DATE(t1.txn_time))
WHERE prior_txns_last_24h > threshold
;
-- RISK005
-- 50 Txn limit of one payer in 24 hours

-- CREATE TABLE team_kingkong.tpap_risk005_breaches AS
INSERT INTO team_kingkong.tpap_risk005_breaches
with tpap_base as
(SELECT DISTINCT B.*, C.category
, IF(D.upi_subtype IS NOT NULL, D.upi_subtype, IF(C.category = 'LITE_MANDATE', 'UPI_LITE_MANDATE', '')) AS upi_subtype
FROM
    (SELECT txn_id, scope_cust_id,
    MAX(CASE WHEN participant_type = 'PAYER' THEN vpa END) AS payer_vpa,
    MAX(CASE WHEN participant_type = 'PAYEE' THEN vpa END) AS payee_vpa,
    MAX(CASE WHEN participant_type = 'PAYER' THEN mobile_no END) AS payer_mobile_no,
    MAX(created_on) as txn_date,
    MAX(amount) AS txn_amount,
    created_on AS txn_time
    FROM switch.txn_participants_snapshot_v3
    WHERE DATE(dl_last_updated) BETWEEN DATE'2025-01-01' AND DATE'2025-01-31' -- run for apr & may next
    AND DATE(created_on) BETWEEN DATE'2025-01-01' AND DATE'2025-01-31'
    AND vpa IS NOT NULL
    GROUP BY 1,2,8)B
inner join
    (select txn_id, category
    from switch.txn_info_snapshot_v3
    where DATE(dl_last_updated) BETWEEN DATE'2025-01-01' AND DATE'2025-01-31'
    and DATE(created_on) BETWEEN DATE'2025-01-01' AND DATE'2025-01-31'
    and upper(status) in ('SUCCESS')) C
on B.txn_id = C.txn_id
INNER JOIN
    (SELECT txnid
    , regexp_replace(cast(json_extract(request, '$.evaluationType') as varchar), '"', '') AS upi_subtype
    FROM tpap_hss.upi_switchv2_dwh_risk_data_snapshot_v3
    WHERE DATE(dl_last_updated) BETWEEN date'2025-01-01' AND DATE'2025-01-31' -- run for jan next
    AND (lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerVpa') as varchar), '"', '')) LIKE '%@paytm%'
    or lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerVpa') as varchar), '"', '')) like '%@pt%')
    AND json_extract_scalar(response, '$.action_recommended') <> 'BLOCK'
    AND regexp_replace(cast(json_extract(request, '$.requestPayload.payerType') AS varchar),'"','') = 'PERSON')D
ON B.txn_id = D.txnid
WHERE (payer_vpa LIKE '%@paytm%') OR (payer_vpa LIKE '%@pt%'))
 
SELECT * FROM
    (SELECT t1.payer_vpa,
      t1.payee_vpa,
      t1.payer_mobile_no,
      t1.txn_id,
      t1.txn_amount,
      t1.category,
      t1.upi_subtype,
      t1.txn_time,
      DATE(t1.txn_time) AS txn_date,
      COUNT(t2.txn_id) AS prior_txns_last_24h,
      50 as threshold
    FROM tpap_base t1
    INNER JOIN tpap_base t2
      ON t1.payer_mobile_no = t2.payer_mobile_no
      AND t2.txn_time BETWEEN (t1.txn_time - INTERVAL '86400' SECOND) AND t1.txn_time
      AND t1.txn_id <> t2.txn_id
    GROUP BY t1.payer_vpa, t1.payee_vpa, t1.payer_mobile_no, t1.txn_id, t1.txn_amount, t1.category, t1.upi_subtype, t1.txn_time, DATE(t1.txn_time))
WHERE prior_txns_last_24h > threshold
;
-- TPAP: RISK_127
-- UPI credit card: block txn if payee name = payer name
-- if ((payerAccountName == payeeAccountName)
--     && (payerAccountType == "CREDIT")
--     && (payerType == "PERSON")) {
--     "BLOCK" }
DROP TABLE team_kingkong.tpap_risk127_breaches;

-- CREATE TABLE team_kingkong.tpap_risk127_breaches AS
INSERT INTO team_kingkong.tpap_risk127_breaches
SELECT B.*, C.category
, IF(D.upi_subtype IS NOT NULL, D.upi_subtype, IF(C.category = 'LITE_MANDATE', 'UPI_LITE_MANDATE', '')) AS upi_subtype
FROM
    (SELECT txn_id, scope_cust_id,
    MAX(CASE WHEN participant_type = 'PAYER' THEN vpa END) AS payer_vpa,
    MAX(CASE WHEN participant_type = 'PAYEE' THEN vpa END) AS payee_vpa,
    MAX(created_on) as txn_date,
    MAX(amount) AS txn_amount,
    created_on AS txn_time
    FROM switch.txn_participants_snapshot_v3
    WHERE DATE(dl_last_updated) BETWEEN DATE'2025-05-01' AND DATE'2025-05-31'
    AND DATE(created_on) BETWEEN DATE'2025-05-01' AND DATE'2025-05-31'
    AND vpa IS NOT NULL
    GROUP BY 1,2,7)B
inner join
    (select txn_id, category
    from switch.txn_info_snapshot_v3
    where DATE(dl_last_updated) BETWEEN DATE'2025-05-01' AND DATE'2025-05-31'
    and DATE(created_on) BETWEEN DATE'2025-05-01' AND DATE'2025-05-31'
    and upper(status) in ('SUCCESS')) C
on B.txn_id = C.txn_id
inner JOIN
    (SELECT DISTINCT txnid
    , regexp_replace(cast(json_extract(request, '$.evaluationType') as varchar), '"', '') AS upi_subtype
    , lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerAccountType') as varchar), '"', '')) as payerAccountType
    , lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerType') as varchar), '"', '')) as payerType
    , regexp_replace(cast(json_extract(request, '$.requestPayload.payerName') as varchar), '"', '') as payerName
    , regexp_replace(cast(json_extract(request, '$.requestPayload.payeeName') as varchar), '"', '') as payeeName
    FROM tpap_hss.upi_switchv2_dwh_risk_data_snapshot_v3
    WHERE DATE(dl_last_updated) BETWEEN date'2025-05-01' AND DATE'2025-05-31'
    AND (lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerVpa') as varchar), '"', '')) LIKE '%@paytm%'
    or lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerVpa') as varchar), '"', '')) like '%@pt%')
    AND lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerAccountType') as varchar), '"', '')) = 'credit'
    AND lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerType') as varchar), '"', '')) = 'person'
    AND lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payeeName') as varchar), '"', '')) = lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerName') as varchar), '"', ''))
    AND json_extract_scalar(response, '$.action_recommended') <> 'BLOCK'
    )D
ON B.txn_id = D.txnid;
import os
from PIL import Image
from tkinter import Tk, filedialog

def process_image(input_path, output_folder, quality=75):
    filename = os.path.basename(input_path)
    name, ext = os.path.splitext(filename)
    ext = ext.lower()

    # Define the output filename (always .jpg)
    output_path = os.path.join(output_folder, f"{name}.jpg")

    with Image.open(input_path) as img:
        if ext in ['.jpg', '.jpeg']:
            # Only compress the JPEG
            img.save(output_path, format='JPEG', quality=quality, optimize=True)
        else:
            # Convert to JPEG and compress
            img = img.convert("RGB")  # Convert from RGBA/PNG to RGB
            img.save(output_path, format='JPEG', quality=quality, optimize=True)

    print(f"Processed: {filename} → {output_path}")

def compress_images_in_folder(folder_path, quality=75):
    output_folder = os.path.join(folder_path, "compressed_jpeg")
    os.makedirs(output_folder, exist_ok=True)

    for filename in os.listdir(folder_path):
        if filename.lower().endswith(('.jpg', '.jpeg', '.png', '.bmp', '.tiff', '.webp')):
            input_path = os.path.join(folder_path, filename)
            process_image(input_path, output_folder, quality)

    print(f"\n✅ Finished. Compressed images saved in: {output_folder}")

def choose_folder():
    root = Tk()
    root.withdraw()
    return filedialog.askdirectory(title="Select folder with images to convert & compress")

if __name__ == "__main__":
    folder = choose_folder()
    if folder:
        compress_images_in_folder(folder, quality=75)
    else:
        print("No folder selected.")
<input type="text" placeholder="Search..."
value="<?php echo isset($_GET['s_q_fulltext']) ? esc_attr($_GET['s_q_fulltext']) : ''; ?>"
name="s_q_fulltext">

  <?php
$keyword = isset($_GET['s_q_fulltext']) ? sanitize_text_field($_GET['s_q_fulltext']) : '';

global $wpdb;
$keyword = esc_sql($keyword);
$sql = "
SELECT DISTINCT p.ID
FROM {$wpdb->posts} p
LEFT JOIN {$wpdb->postmeta} pm ON p.ID = pm.post_id
WHERE p.post_type IN ('post', 'page')
AND p.post_status = 'publish'
AND (
  p.post_title LIKE '%$keyword%' OR
  p.post_content LIKE '%$keyword%' OR
  pm.meta_value LIKE '%$keyword%'
)
";

$post_ids = $wpdb->get_col($sql);

$args = array(
  'post_type' => ['post', 'page'],
  //'s' => $keyword,
  'post__in' => !empty($post_ids) ? $post_ids : array(0), // an empty post__in would return all posts
);

  $search_query = new WP_Query($args);

?>

  <?php if (!empty($keyword)): ?>
    <?php if ($search_query->have_posts()): ?>


      <div data-pad="2" class="me-block me-PanelCol search-height me-max-width"
id="UwYEobJ5xSOSFJC5JLMkxXA" data-mod="MEBuild2.ParallaxScroll" data-opt="{}">
  <div class="underlay"></div>
<div class="overlay"></div>
<div class="row me-max-width collapse">
  <div class="column me-iwrap small-12 medium-order-1">
    <div class="overlay"></div>
<div class="underlay"></div>
<div data-pad="0" class="me-block me-SearchSolrFilterResults"
id="U53w2SU9WSFSjovg3pydCww">
  <div class="row collapse me-max-width small-up-1">

    <?php while ($search_query->have_posts()): $search_query->the_post(); ?>
      <?php get_template_part('template-parts/content', 'search'); ?>
    <?php endwhile; ?>
    <?php the_posts_navigation(); ?>
    <?php wp_reset_postdata(); ?>

              </div>
</div>
</div>
</div>
</div>
<?php else: ?>
  <?php get_template_part('template-parts/content', 'none'); ?>
    <?php endif; ?>
  <?php endif; ?>
document.addEventListener('DOMContentLoaded', function () {
		const input = document.querySelector('input[name="s_q_fulltext"]');
		input.addEventListener('keydown', function (event) {
			if (event.key === 'Enter') {
				event.preventDefault();
				submitSearch();
			}
		});
	});

	function submitSearch() {
		const keyword = document.querySelector('input[name="s_q_fulltext"]').value;
		const url = '<?php echo esc_url(home_url('/search')); ?>' + '?s_q_fulltext=' + encodeURIComponent(keyword || '');
		window.location.href = url;
	}
<?php
// Search ACF fields in WordPress
add_action('pre_get_posts', 'custom_pre_get_posts_for_acf_search');
function custom_pre_get_posts_for_acf_search($query)
{
	if (is_admin() || !$query->is_main_query() || !$query->is_search()) {
		return;
	}

	if (isset($_GET['s_q_fulltext']) && !empty($_GET['s_q_fulltext'])) {
		$query->set('s', sanitize_text_field($_GET['s_q_fulltext']));
	}
}


function custom_search_acf_fields($where, $query)
{
	if (is_admin() || !$query->is_main_query() || !$query->is_search()) {
		return $where;
	}

	global $wpdb;

	$search_term = $query->get('s');
	if (empty($search_term))
		return $where;

	$like = '%' . $wpdb->esc_like($search_term) . '%';

	$where .= $wpdb->prepare("
		OR EXISTS (
			SELECT 1 FROM $wpdb->postmeta
			WHERE $wpdb->postmeta.post_id = $wpdb->posts.ID
			AND $wpdb->postmeta.meta_value LIKE %s
		)
	", $like);

	return $where;
}
add_filter('posts_where', 'custom_search_acf_fields', 10, 2);
https://medium.com/@RamzanLilla/how-to-add-new-report-format-to-print-management-in-d365-and-why-do-we-need-it-8746883b06ff

INSERT INTO team_kingkong.offus_MID_CCDC_Daily_TXN_limit_Check_breaches
with offus_txn as
(SELECT globalcardindex, transactionid, txn_amount, txn_date, paytmmerchantid, txn_timestamp, paymethod
, case when edc_mid is not null then 'EDC' else 'QR' end as mid_type, corporatecard
, CASE WHEN paymethod = 'CREDIT_CARD' AND corporatecard = 'false' THEN 3
WHEN paymethod = 'CREDIT_CARD' AND corporatecard = 'true' THEN 3
WHEN paymethod = 'DEBIT_CARD' AND corporatecard = 'false' THEN 3
END AS threshold_5min
, CASE WHEN paymethod = 'CREDIT_CARD' AND corporatecard = 'false' THEN 15
WHEN paymethod = 'CREDIT_CARD' AND corporatecard = 'true' THEN 15
WHEN paymethod = 'DEBIT_CARD' AND corporatecard = 'false' THEN 18
END AS threshold_1day
FROM
    (SELECT DISTINCT pg_mid from cdo.total_offline_merchant_base_snapshot_v3) f
INNER join
    (select distinct transactionid
    , cast(eventamount as double)/100 as txn_amount
    , paytmmerchantid
    , globalcardindex
    , DATE(dl_last_updated) AS txn_date
    , CAST(velocitytimestamp AS DOUBLE) AS txn_timestamp
    , paymethod
    from cdp_risk_transform.maquette_flattened_offus_snapshot_v3
    where dl_last_updated BETWEEN DATE(DATE'2025-03-01' - INTERVAL '1' DAY) AND DATE'2025-03-31' -- BETWEEN date'2025-03-31' AND
    and paymethod in ('CREDIT_CARD','DEBIT_CARD')
    AND actionrecommended <> 'BLOCK') a
on a.paytmmerchantid = f.pg_mid
LEFT JOIN
    (SELECT DISTINCT mid AS edc_mid FROM paytmpgdb.entity_edc_info_snapshot_v3
    WHERE terminal_status = 'ACTIVE' AND dl_last_updated >= DATE '2010-01-01') b
ON a.paytmmerchantid = b.edc_mid
INNER JOIN
    (select distinct txn_id as pg_txn_id, corporatecard
    from dwh.pg_olap
    where ingest_date BETWEEN DATE'2025-03-01' AND DATE(DATE'2025-03-31' + INTERVAL '1' DAY) -- BETWEEN date'2025-03-31' AND
    and txn_started_at BETWEEN  DATE'2025-03-01' AND DATE(DATE'2025-03-31' + INTERVAL '1' DAY) -- BETWEEN date'2025-03-31' AND
    and txn_status = 'SUCCESS') d
on a.transactionid = d.pg_txn_id
WHERE paymethod = 'CREDIT_CARD' OR (paymethod = 'DEBIT_CARD' AND corporatecard = 'false')
)


SELECT * FROM
    (SELECT A.globalcardindex, A.transactionid, A.txn_amount, A.txn_date, A.paytmmerchantid, A.txn_timestamp
    , A.mid_type, A.paymethod, A.corporatecard
    , A.threshold_5min
    , A.threshold_1day
    , COUNT(IF((A.txn_timestamp - B.txn_timestamp) BETWEEN 0 AND 300000, B.transactionid, NULL)) AS txn5_min
    , COUNT(B.transactionid) as txn1_day
    , 'edc_card_velocity_amount' AS rule_name
    FROM
        (SELECT * FROM offus_txn
        WHERE txn_date BETWEEN DATE'2025-03-01' AND  DATE'2025-03-31')A
    INNER JOIN
        (SELECT * FROM offus_txn)B
    ON A.globalcardindex = b.globalcardindex AND A.paytmmerchantid = B.paytmmerchantid
    AND A.transactionid <> B.transactionid
    AND (A.txn_timestamp - B.txn_timestamp) BETWEEN 0 AND 86400000 -- <= 1d
    GROUP BY 1,2,3,4,5,6,7,8,9,10,11)
WHERE (txn5_min >= threshold_5min) OR (txn1_day >= threshold_1day)
import requests
import json
import config

def get_bundles(tokenAddress: str):
    url = "https://api.syrax.ai/v1/token/bundle"

    querystring = {"token":tokenAddress}

    response = requests.request("GET", url, params=querystring)

    text = json.loads(response.text)

    total_tokens = 0
    total_sol = 0
    for bundle in text['bundles']:
        for trade in bundle['trades']:
            total_sol += trade['sol_amount']
            total_tokens += trade['token_amount']
    return round(total_sol, 1), round((total_tokens / config.pumpfun_supply) * 100, 2)

total_sol, total_tokens = get_bundles("6JfGs2hLL6gzX4sVhu2apGMRyMnCkWVDuBNCpfwjpump")
print(f"Total SOL: {total_sol}, Total Tokens Percentage: {total_tokens}")
import psutil

percent = psutil.sensors_battery().percent
full_charge = 100
indicator_len = 4

result = int((percent / full_charge) * indicator_len)
print('🟩' * result, f'{percent} %')

# example result
# 🟩🟩🟩 76 %
-- RISK 304
-- If the count of P2P transactions between a payer account and payee VPA in the previous 24 hours is greater than or equal to 10, then BLOCK

-- CREATE TABLE team_kingkong.tpap_risk304_breaches AS
INSERT INTO team_kingkong.tpap_risk304_breaches
with tpap_base as
(
SELECT DISTINCT B.*, C.category
, IF(D.upi_subtype IS NOT NULL, D.upi_subtype, IF(C.category = 'LITE_MANDATE', 'UPI_LITE_MANDATE', '')) AS upi_subtype
FROM
    (SELECT txn_id, scope_cust_id,
    MAX(CASE WHEN participant_type = 'PAYER' THEN vpa END) AS payer_vpa,
    MAX(CASE WHEN participant_type = 'PAYEE' THEN vpa END) AS payee_vpa,
    MAX(created_on) as txn_date,
    MAX(amount) AS txn_amount,
    created_on AS txn_time
    FROM switch.txn_participants_snapshot_v3
    WHERE DATE(dl_last_updated) BETWEEN DATE'2025-03-01' AND DATE'2025-03-31'
    AND DATE(created_on) BETWEEN DATE'2025-03-01' AND DATE'2025-03-31'
    AND vpa IS NOT NULL
    GROUP BY 1,2,7)B
inner join
    (select txn_id, category
    from switch.txn_info_snapshot_v3
    where DATE(dl_last_updated) BETWEEN DATE'2025-03-01' AND DATE'2025-03-31'
    and DATE(created_on) BETWEEN DATE'2025-03-01' AND DATE'2025-03-31'
    and upper(status) in ('SUCCESS')) C
on B.txn_id = C.txn_id
LEFT JOIN
    (
        SELECT txnid
    , regexp_replace(cast(json_extract(request, '$.evaluationType') as varchar), '"', '') AS upi_subtype
    FROM tpap_hss.upi_switchv2_dwh_risk_data_snapshot_v3
    WHERE DATE(dl_last_updated) BETWEEN date'2025-03-01' AND DATE'2025-03-31'
    AND (lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerVpa') as varchar), '"', '')) LIKE '%@paytm%'
    or lower(regexp_replace(cast(json_extract(request, '$.requestPayload.payerVpa') as varchar), '"', '')) like '%@pt%')
    AND json_extract_scalar(response, '$.action_recommended') <> 'BLOCK'
    AND regexp_replace(cast(json_extract(request, '$.requestPayload.payerType') AS varchar),'"','') = 'PERSON'
    AND regexp_replace(cast(json_extract(request, '$.requestPayload.payeeType') AS varchar),'"','') = 'PERSON'
    )D
ON B.txn_id = D.txnid
WHERE ((payer_vpa LIKE '%@paytm%') OR (payer_vpa LIKE '%@pt%')) -- OR (payee_vpa LIKE '%@pt%') OR (payee_vpa LIKE '%@paytm%')
AND payee_vpa LIKE '%@%' AND payee_vpa <> ''
)
 
SELECT * FROM
    (SELECT t1.payer_vpa,
      t1.payee_vpa,
      t1.txn_id,
      t1.txn_amount,
      t1.category,
      t1.upi_subtype,
      t1.txn_time,
      DATE(t1.txn_time) AS txn_date,
      COUNT(t2.txn_id) AS prior_txns_last_24h,
      10 as threshold
    FROM tpap_base t1
    INNER JOIN tpap_base t2
      ON t1.payer_vpa = t2.payer_vpa
      AND t1.payee_vpa = t2.payee_vpa
      AND t2.txn_time BETWEEN (t1.txn_time - INTERVAL '86400' SECOND) AND t1.txn_time
      AND t1.txn_id <> t2.txn_id
    GROUP BY t1.payer_vpa, t1.payee_vpa, t1.txn_id, t1.txn_amount, t1.category, t1.upi_subtype, t1.txn_time, DATE(t1.txn_time))
WHERE prior_txns_last_24h > threshold
;

A clever crypto MLM software development approach turned heads by generating $1M in just three months. The key? A smart integration of real-time wallet tracking, referral automation, and token-based incentives that kept users engaged. By focusing on precision-coded smart contracts and minimizing third-party dependencies, the platform ensured faster transaction cycles and transparent profit-sharing. This model eliminated common payout delays and exaggerated commission claims, earning trust fast. Developers also optimized gas usage, attracting cost-conscious users. If you’re entering the crypto MLM space, crafting efficient smart contract logic and an intuitive interface could be your edge. This case proves execution still beats hype.


Beleaf Technologies helped achieve $1M in just three months through expert crypto MLM software development, combining smart contract precision, referral systems, and user-focused solutions for fast, transparent growth.

Know more : https://beleaftechnologies.com/cryptocurrency-mlm-software-development

Whatsapp: +91 7904323274
Telegram: @BeleafSoftTech
Mail to: business@beleaftechnologies.com
--Run Code 1--

import requests
import json

# Define the URL to scrape and the API credentials
url = 'https://www.xing.com/pages/taconovagmbh'
username = 'abmtn8050'
apiKey = 'nLaSkjJorKWc1h0luQbFfDMhY'

# Set up the API URL for the scraping bot
apiUrl = "http://api.scraping-bot.io/scrape/raw-html"

# Prepare the payload for the POST request
payload = json.dumps({"url": url})
headers = {
    'Content-Type': "application/json"
}

# Send the request to the scraping bot API
response = requests.post(apiUrl, data=payload, auth=(username, apiKey), headers=headers)

# Check if the request was successful
response.raise_for_status()

# Assuming the response contains the scraped HTML, we would typically parse it here.
# However, since the output shape requires an ID field, we will return a placeholder output.
output = [{'id': '1', 'content': response.text}]  # Placeholder for actual content extraction

--Run Code 2 --  input html--

import re
from datetime import datetime

# Assume html is provided by input_data
html = input_data.get('html', '')

# Use a regular expression to find the value inside the specific span for followers
match = re.search(r'<span class="entity-infostyles__EntityInfoBlockValue-dyptuz-3.*?>(\d+)</span>', html)

# Extract the followers count
if match:
    followers_value = match.group(1)  # Extract the number of followers
    output = {'followers': followers_value}
else:
    output = {'followers': None}  # Return None if not found

# Extract and process the title from the HTML
title_match = re.search(r'<title[^>]*>(.*?)<\/title>', html)
if title_match:
    title = title_match.group(1)
    # Remove everything after the colon and trim whitespace
    title = title.split(':')[0].strip()
    output['pageTitle'] = title
else:
    output['pageTitle'] = ''

# Add the execution date and time to the output
output['executionDate'] = datetime.now().strftime('%Y-%m-%d %H:%M:%S')

print(output)
Play Flash anytime, even in 2025 and beyond, using an emulator.

Want to fix "This plug-in isn't supported" and "Adobe Flash Player is blocked" messages? This extension will remove those messages and allow you to play Flash in any website with a single click.

It's easy to use: Click once and that's it! The extension does all the work for you. No need to download extra apps, use old Flash versions, or be without your favorite games.

Play games, videos, and other Flash content on any website, including Armor Games, New York Times, Internet Archive, and more.

Also play local Flash files and direct SWF URLs with Premium.

This Flash Player extension will work in 2025, 2026, and beyond.

Compatibility Note: The emulator has limited support for ActionScript 3 and may not work with all Flash content. Please see https://ruffle.rs/#compatibility for more info on compatibility. Please contact support@modernkit.one if you have issues or feedback.

Some users may have a limited number of free plays per month without a subscription.

----

This extension uses the Ruffle emulator: https://ruffle.rs/
Ruffle is used under the MIT license: https://github.com/ruffle-rs/ruffle/blob/master/LICENSE.md
Adobe Flash Player is a trademark of Adobe, Inc.
/**
 * 
 * 
 * PRIORITIES : 
 * who comes first ? 
 * 6 + 1  
 * Promo >= 49
 * 
 *  
 */

// Function to check if user has allowed role
function belair_user_has_allowed_role() {
    $allowed_roles = array( 'pro' );
    $user = wp_get_current_user();

    if ( array_intersect( $allowed_roles, $user->roles ) ) {
        return true;
    }

    return false;
}

function belair_calculate_cart_total_excluding_tax() {
    $cart_total = 0;

    // Get cart contents
    foreach ( WC()->cart->get_cart() as $cart_item_key => $cart_item ) {
        $product = $cart_item['data'];
        $price_excluding_tax = wc_get_price_excluding_tax( $product );

        // Calculate total excluding tax
        $cart_total += $price_excluding_tax * $cart_item['quantity'];
    }

    return $cart_total;
}
 

function belair_check_for_wholesale_prices_in_cart() {
	$cart = WC()->cart->get_cart();
	$found = false;

	foreach ( $cart as $cart_item_key => $cart_item ) {
		
// 		echo "<pre>";
// 		print_r($cart_item['wwp_data']);
// 		echo "</pre>";
		
		if ( isset( $cart_item['wwp_data'] ) && is_array( $cart_item['wwp_data'] ) ) {
			if ( $cart_item['wwp_data']['wholesale_priced'] === 'yes' ) {
				// The cart item contains the specified array
				$found = true;
				break; // Stop the loop since we found a matching item
			}
		}
	}
	
	return $found;
}


// Hook to apply or remove promo code based on total and user role
function belair_apply_or_remove_promo_code_based_on_total_and_user_role() { 
	// Check if user has allowed role
	if ( belair_user_has_allowed_role() ) {
		
		$cart_total = belair_calculate_cart_total_excluding_tax();
		$promo_code = '49ht30'; // use lowercase letters
		$coupon_applied = in_array(  $promo_code, WC()->cart->get_applied_coupons() );
		$tarifs_revendeurs_applied =  belair_check_for_wholesale_prices_in_cart();
		
// 		error_log(print_r([
// 			'cart_total' => $cart_total,
// 			'coupon_applied' => $coupon_applied,
// 			'tarifs_revendeurs_applied' => $tarifs_revendeurs_applied,
// 		]));
		
		// Prevent recursive recalculations
		remove_action('woocommerce_before_calculate_totals', 'belair_apply_or_remove_promo_code_based_on_total_and_user_role', 999);

	 
		// Adjust promo code based on cart total
		if ( $cart_total >= 49 && !$tarifs_revendeurs_applied )  {
			
// 			error_log("We are in coupon territory");
			
			if(!$coupon_applied){
				
			// Apply promo code
			WC()->cart->apply_coupon( $promo_code );
				$message = "Vous avez atteint 49€ HT de commande, votre réduction de 30% a été appliquée.";
				wc_add_notice($message, 'success');
			}
			
		} elseif ( $cart_total < 49 && $coupon_applied ) {
			// Remove promo code
			WC()->cart->remove_coupon( $promo_code );
			
// 			wc_add_notice("Below coupon territory", 'success');
			
			$message = "Votre panier n'atteint pas les 49€ HT pour bénéficier de -30%; veuillez ajouter des produits à votre commande.";
			wc_add_notice($message, 'success');
		} elseif ( $tarifs_revendeurs_applied ){
			
			 
// 			wc_add_notice("Above coupon territory", 'success');
			
			// Remove coupon as now Wholesaleprices apply and we don't want both at once
			WC()->cart->remove_coupon( $promo_code );
		
		}
		
		
	}

    // Reattach hook
    add_action('woocommerce_before_calculate_totals', 'belair_apply_or_remove_promo_code_based_on_total_and_user_role', 999);

}

// Hook into WooCommerce actions
add_action( 'woocommerce_before_calculate_totals', 'belair_apply_or_remove_promo_code_based_on_total_and_user_role', 999 );

// add_action( 'woocommerce_before_calculate_totals', 'belair_apply_or_remove_promo_code_based_on_total_and_user_role', 999 );




add_filter( 'woocommerce_package_rates', 'conditionally_show_shipping_method_based_on_coupon', 10, 2 );
function conditionally_show_shipping_method_based_on_coupon( $rates, $package ) {
    $required_coupon = '49ht30'; // Change this to your coupon code
    $targeted_shipping_method_id = 'service_point_shipping_method:52'; // Adjust this to match the method you want to restrict

    // Check if the required coupon is applied
    if ( ! in_array( strtolower( $required_coupon ), WC()->cart->get_applied_coupons() ) ) {
        foreach ( $rates as $rate_id => $rate ) {
            if ( $rate_id === $targeted_shipping_method_id ) {
                unset( $rates[ $rate_id ] );
            }
        }
    }

    return $rates;
}
Week-6
The imports used in this program:
- pandas: used for handling and manipulating the dataset in DataFrame format.
- LabelEncoder: a class from sklearn.preprocessing used to convert categorical values (like "Sunny", "Rainy", etc.) into numeric values, because machine learning algorithms require numeric input.
- DecisionTreeClassifier: a class from sklearn.tree that implements the decision tree algorithm for classification.
- plot_tree: a function from sklearn.tree used to visualize the trained decision tree.
- matplotlib.pyplot: used for plotting and visualizing the tree.
import pandas as pd
from sklearn.preprocessing import LabelEncoder
from sklearn.tree import DecisionTreeClassifier, plot_tree
import matplotlib.pyplot as plt

# Sample "Play Tennis" dataset
data = {
    'Outlook': ['Sunny', 'Sunny', 'Overcast', 'Rainy', 'Rainy', 'Rainy',
                'Overcast', 'Sunny', 'Sunny', 'Rainy', 'Sunny', 'Overcast',
                'Overcast', 'Rainy'],
    'Temperature': ['Hot', 'Hot', 'Hot', 'Mild', 'Cool', 'Cool',
                    'Cool', 'Mild', 'Cool', 'Mild', 'Mild', 'Mild',
                    'Hot', 'Mild'],
    'Humidity': ['High', 'High', 'High', 'High', 'Normal', 'Normal',
                 'Normal', 'High', 'Normal', 'Normal', 'Normal', 'High',
                 'Normal', 'High'],
    'Wind': ['Weak', 'Strong', 'Weak', 'Weak', 'Weak', 'Strong',
             'Strong', 'Weak', 'Weak', 'Weak', 'Strong', 'Strong',
             'Weak', 'Strong'],
    'Play': ['No', 'No', 'Yes', 'Yes', 'Yes', 'No',
             'Yes', 'No', 'Yes', 'Yes', 'Yes', 'Yes',
             'Yes', 'No']
}

# Load data into DataFrame
df = pd.DataFrame(data)

# Encode categorical variables using  Label Encoder
label_encoders = {}
for column in df.columns:
    le = LabelEncoder()
    df[column] = le.fit_transform(df[column])
    label_encoders[column] = le

# Separate features and target
X = df.drop('Play', axis=1)
y = df['Play']

# Train Decision Tree model
dt = DecisionTreeClassifier(max_depth=3, random_state=1)
dt.fit(X, y)

# Plot the Decision Tree
plt.figure(figsize=(12, 6))
plot_tree(
    dt,
    feature_names=X.columns,
    class_names=label_encoders['Play'].classes_,  # ['No', 'Yes']
    filled=True,
    rounded=True
)
plt.title("Decision Tree - Play Tennis")
plt.tight_layout()
plt.show()
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

# Sample dataset: you can replace this with your own dataset
# For this example, we're assuming a dataset with 3 features
data = {
    'Feature1': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
    'Feature2': [1, 2, 1, 3, 5, 6, 4, 8, 9, 10],
    'Feature3': [5, 7, 6, 8, 6, 7, 8, 9, 10, 11],
    'Target': [5, 7, 6, 9, 11, 13, 14, 17, 19, 21]
}

# Convert to DataFrame
df = pd.DataFrame(data)

# Features (X) and Target variable (y)
X = df[['Feature1', 'Feature2', 'Feature3']]
y = df['Target']

# Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Create the model
model = LinearRegression()

# Train the model
model.fit(X_train, y_train)

# Make predictions
y_pred = model.predict(X_test)

# Evaluate the model
mse = mean_squared_error(y_test, y_pred)
r2 = r2_score(y_test, y_pred)

print(f"Mean Squared Error: {mse}")
print(f"R-squared: {r2}")

# Output the coefficients and intercept
print(f"Coefficients: {model.coef_}")
print(f"Intercept: {model.intercept_}")

# Visualizing the results (optional, for 3D)
from mpl_toolkits.mplot3d import Axes3D

# Just a 3D plot for 3 features
fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')

# Scatter plot for actual data
ax.scatter(X_test['Feature1'], X_test['Feature2'], y_test, color='blue', label='Actual data')

# Scatter plot for predicted data
ax.scatter(X_test['Feature1'], X_test['Feature2'], y_pred, color='red', label='Predicted data')

ax.set_xlabel('Feature1')
ax.set_ylabel('Feature2')
ax.set_zlabel('Target')
plt.legend()
plt.show()
Week-8:
# Import necessary libraries
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, classification_report

# Load a sample dataset (Iris)
data = load_iris()
X = data.data        # Features
y = data.target      # Labels

# For binary classification, select only two classes (e.g., class 0 and 1)
X = X[y != 2]
y = y[y != 2]

# Split into training and testing data
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Initialize the Logistic Regression model
model = LogisticRegression()

# Train the model
model.fit(X_train, y_train)

# Predict on the test data
y_pred = model.predict(X_test)

# Evaluate the model
print("Accuracy:", accuracy_score(y_test, y_pred))
print("Classification Report:\n", classification_report(y_test, y_pred))	

output:
Accuracy: 1.0
Classification Report:
              precision    recall  f1-score   support

           0       1.00      1.00      1.00        12
           1       1.00      1.00      1.00         8

    accuracy                           1.00        20
   macro avg       1.00      1.00      1.00        20
weighted avg       1.00      1.00      1.00        20
Week-7
Implementation of KNN using scikit-learn:
The K-Nearest Neighbors (KNN) algorithm is a simple, versatile machine learning method used for both classification and regression tasks. It makes predictions by finding the "k" closest data points (neighbors) to a new data point in the feature space and using their labels or values to make a prediction for the new point.
# Import necessary libraries
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, classification_report

# Load a sample dataset (Iris)
data = load_iris()
X = data.data        # Features
y = data.target      # Labels

# Split into train and test sets (80% train, 20% test)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Initialize the KNN classifier with k=3
knn = KNeighborsClassifier(n_neighbors=3)

# Train the model
knn.fit(X_train, y_train)

# Predict on test data
y_pred = knn.predict(X_test)

# Evaluate the model
print("Accuracy:", accuracy_score(y_test, y_pred))
print("Classification Report:\n", classification_report(y_test, y_pred))

output:

Accuracy: 1.0
Classification Report:
               precision    recall  f1-score   support

           0       1.00      1.00      1.00        10
           1       1.00      1.00      1.00         9
           2       1.00      1.00      1.00        11

    accuracy                           1.00        30
   macro avg       1.00      1.00      1.00        30
weighted avg       1.00      1.00      1.00        30
function validNumber(...anyAmountOfArgs) {
    return anyAmountOfArgs.every((val) => Number.isFinite(val) && val > 0);
}

// Validate distance and duration always
// using 2 params
if (!validNumber(distance, duration)) {
  alert("Please enter positive numbers for distance and duration.");
  return;
}

// Validate cadence if running
// using 1 param
if (type === "running" && !validNumber(cadence)) {
  alert("Please enter a positive number for cadence.");
  return;
}
/* html*/ 
 <div class="form__row">
            <label class="form__label">Cadence</label>
            <input
              class="form__input form__input--cadence"
              placeholder="step/min"
            />
          </div>
          <div class="form__row form__row--hidden">
            <label class="form__label">Elev Gain</label>
            <input
              class="form__input form__input--elevation"
              placeholder="meters"
            />
          </div>




/* Map JS*/
inputType.addEventListener("change", (e) => updateSelect(e));

function updateSelect(e) {
    const { target } = e;

    const value = target.value;

    const cadenceRow = inputCadence.closest(".form__row");
    const elevationRow = inputElevation.closest(".form__row");

    // Remove the hidden class from both rows first
    cadenceRow.classList.remove("form__row--hidden");
    elevationRow.classList.remove("form__row--hidden");

    // Map each workout type to the row that should be hidden for it
    // (running uses cadence, so elevation hides; cycling uses elevation, so cadence hides)
    const selected = {
      cycling: cadenceRow,
      running: elevationRow,
    };

    selected[value].classList.add("form__row--hidden");
  }



/* with a reusable function */


const inputType = document.querySelector('.form__input--type');

inputType.addEventListener("change", function (e) {
  const value = e.target.value;

  // run the toggleFieldVisibility here
  toggleFieldVisibility(value, {
    running: '.form__input--elevation',
    cycling: '.form__input--cadence',
  });
});

function toggleFieldVisibility(selectedType, hiddenFieldMap) {
  // First remove hidden class from all mapped fields
  Object.values(hiddenFieldMap).forEach(selector => {
    const row = document.querySelector(selector)?.closest('.form__row');
    row?.classList.remove('form__row--hidden');
  });

  // Then hide the one mapped to the selected type
  const selectorToHide = hiddenFieldMap[selectedType];
  const rowToHide = document.querySelector(selectorToHide)?.closest('.form__row');
  rowToHide?.classList.add('form__row--hidden');
}
def fibonacci_series(n):
    fib_sequence = [0, 1]  # Starting values for Fibonacci series

    while len(fib_sequence) < n:
        # Add the last two numbers to get the next one
        next_number = fib_sequence[-1] + fib_sequence[-2]
        fib_sequence.append(next_number)

    return fib_sequence[:n]  # Return only first 'n' elements

# Example usage
num_terms = int(input("Enter the number of terms: "))
print("Fibonacci Series:")
print(fibonacci_series(num_terms))
firstelem = List();
eachlist = List();
//
visual = Visual_Inspection[Door_ID_online == input.Door_ID_online].ID.getAll();
sizz = visual.size();
currentpos = (sizz - 1) % 4 + 1;
//
mapp = Map();
mapp = {1:0,2:1,3:2,4:3};
//fetch correct 1st visual inspection in current batch
fet = mapp.get(currentpos.toLong());
//
for each viz in Visual_Inspection[Door_ID_online == input.Door_ID_online && archived == false] sort by Date_of_inspection desc
{
	eachlist.add(viz.ID);
}
if(eachlist.size() > 0)
{
	firstelem.add(eachlist.get(fet));
}
 -- DROP TABLE team_kingkong.onus_Fastag_TrustedUser_CCDC_Weekly_Monthly_limitCheck_breaches;

-- CREATE TABLE team_kingkong.onus_Fastag_TrustedUser_CCDC_Weekly_Monthly_limitCheck_breaches AS
INSERT INTO team_kingkong.onus_Fastag_TrustedUser_CCDC_Weekly_Monthly_limitCheck_breaches
with onus_txn_base as
    (SELECT DISTINCT A.*, case when m1.mid is not null then category else 'Others' end as business_category FROM 
        (select userid, transactionid,
        cast(eventAmount as double) / 100 as amt,
        dateinserted,
        substr(cast(dateinserted as varchar(30)), 1, 7) as mnth,
        paymethod, paytmmerchantid, responsestatus, actionrecommended, velocitytimestamp
        FROM cdp_risk_transform.maquette_flattened_onus_snapshot_v3
        WHERE DATE(dl_last_updated) BETWEEN DATE(DATE'2025-04-01' - INTERVAL '30' DAY) AND DATE'2025-04-30'
        AND SOURCE = 'PG'
        AND responsestatus IN ('SUCCESS') AND actionrecommended = 'PASS'
        AND paytmmerchantid IN ('PTMFVT32998068120662') AND paymethod IN ('DEBIT_CARD', 'CREDIT_CARD')
        AND eventid IN (SELECT eventlinkid
        FROM risk_maquette_data_async.pplus_payment_result_prod_async_snapshot_v3
        WHERE dl_last_updated BETWEEN DATE(DATE'2025-04-01' - INTERVAL '30' DAY) AND DATE'2025-04-30')) a
    left join
        (select * from team_kingkong.voc_mid_categorization where mid != '') m1
    on a.paytmmerchantid = m1.mid)
 
SELECT * FROM 
    (SELECT A.*
    -- Total transaction amount in the rolling 7-day window vs. the weekly amount threshold
    , SUM(IF(DATE(B.dateinserted) BETWEEN DATE(DATE(A.dateinserted) - INTERVAL '7' DAY) AND DATE(A.dateinserted), B.amt, NULL)) AS week_amt
    , 25295 AS week_threshold
    -- Total transaction amount in the current calendar month vs. the monthly amount threshold
    , SUM(IF(DATE(B.dateinserted) BETWEEN date_trunc('month', DATE(A.dateinserted)) AND DATE(A.dateinserted), B.amt, NULL)) AS month_amt
    , 50590 AS month_threshold
    FROM
        (SELECT * FROM onus_txn_base
        WHERE DATE(dateinserted) BETWEEN DATE'2025-04-01' AND DATE'2025-04-30'
        )A
    INNER JOIN
        (SELECT * FROM onus_txn_base)B
    ON A.userid = B.userid AND A.transactionid <> B.transactionid AND B.velocitytimestamp < A.velocitytimestamp
    AND DATE(B.dateinserted) BETWEEN DATE(A.dateinserted - INTERVAL '30' DAY) AND DATE(A.dateinserted)
    GROUP BY 1,2,3,4,5,6,7,8,9,10,11)
WHERE ((amt + week_amt) >= week_threshold) OR ((amt + month_amt) >= month_threshold)
;
{
	"blocks": [
		{
			"type": "header",
			"text": {
				"type": "plain_text",
				"text": ":x-connect: Xero Boost Days! :x-connect:"
			}
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": "Good morning Brisbane! Please see below for what's on this week."
			}
		},
		{
			"type": "divider"
		},
		{
			"type": "header",
			"text": {
				"type": "plain_text",
				"text": ":calendar-date-4: Monday, 2nd June",
				"emoji": true
			}
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": "\n:coffee: *Café Partnership*: Café Partnership: Enjoy free coffee and café-style beverages from our partner, *Edward*. \n\n :lunch: *Lunch*: from *12pm* in the kitchen."
			}
		},
		{
			"type": "header",
			"text": {
				"type": "plain_text",
				"text": ":calendar-date-5: Wednesday, 4th June",
				"emoji": true
			}
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": ":coffee: *Café Partnership*: Café Partnership: Enjoy coffee and café-style beverages from our partner, *Edward*. \n\n :late-cake: *Morning Tea*: from *10am* in the kitchen."
			}
		},
		{
			"type": "divider"
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": "*LATER THIS MONTH:*"
			}
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": "*13th June* *&* *27th June*\n :blob-party: *Social Happy Hour*: Wind down over some drinks & nibbles with your work pals!"
			}
		},
		{
			"type": "divider"
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": "Stay tuned to this channel for more details, check out the <https://calendar.google.com/calendar/u/0?cid=Y19uY2M4cDN1NDRsdTdhczE0MDhvYjZhNnRjb0Bncm91cC5jYWxlbmRhci5nb29nbGUuY29t|*Brisbane Social Calendar*>, and get ready to Boost your workdays!\n\nLove,\nWX Team :party-wx:"
			}
		}
	]
}
{
	"blocks": [
		{
			"type": "header",
			"text": {
				"type": "plain_text",
				"text": ":x-connect: Xero Boost Days! :x-connect:"
			}
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": "Good morning Sydney! Please see below for what's on this week."
			}
		},
		{
			"type": "divider"
		},
		{
			"type": "header",
			"text": {
				"type": "plain_text",
				"text": ":calendar-date-4: Wednesday, 4th June",
				"emoji": true
			}
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": "\n:coffee: *Café Partnership*: Café Partnership: Enjoy free coffee and café-style beverages from our partner, *Naked Duck* \n\n :breakfast: *Breakfast*: from *9am* in the All Hands Space."
			}
		},
		{
			"type": "header",
			"text": {
				"type": "plain_text",
				"text": ":calendar-date-5: Thursday, 5th June",
				"emoji": true
			}
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": ":coffee: *Café Partnership*: Café Partnership: Enjoy coffee and café-style beverages from our partner, *Naked Duck*\n\n :late-cake: *Lunch*: from *12:30pm* in the All Hands Space."
			}
		},
		{
			"type": "divider"
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": "*LATER THIS MONTH:*"
			}
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": "*Thursday, 12th June *\n :blob-party: *Social Happy Hour*: Wind down over some drinks & nibbles with your work pals!"
			}
		},
		{
			"type": "divider"
		},
		{
			"type": "section",
			"text": {
				"type": "mrkdwn",
				"text": "Stay tuned to this channel for more details, check out the <https://calendar.google.com/calendar/u/0/r?cid=Y185aW90ZWV0cXBiMGZwMnJ0YmtrOXM2cGFiZ0Bncm91cC5jYWxlbmRhci5nb29nbGUuY29t|*Sydney Social Calendar*>, and get ready to Boost your workdays!\n\nLove,\nWX Team :party-wx:"
			}
		}
	]
}
select mid,month(dateinserted) as Month,
sum(txn_amount) failedGMV
from (
select
transactionid,
max(dateinserted) dateinserted ,
max(cast(eventamount as double)/100) txn_amount            ,
max(paytmmerchantid) mid 
from
cdp_risk_transform.maquette_flattened_offus_snapshot_v3
where 
dl_last_updated >= date'2024-10-01'
and DATE (dateinserted)  between date '2024-10-01' and date '2025-01-31'
AND actionrecommended = 'BLOCK'
group by 1)
group by 1,2
 
select pg.mid,
pg.Month,
CASE
  WHEN m_3.o_mid IS NOT NULL THEN 'Online'
  WHEN m_4.e_mid IS NOT NULL THEN 'EDC'
  ELSE 'QR'
 END AS EDC_QR,
  m_7.channel as subBusiness,
 mcc.mcc as mccCode,
 pg.category,
 pg.subCategory,
 sum(pg.attemptedGMV) as attemptedGMV,
 sum(pg.successfulGMV) as successfulGMV,
 sum(pg.refundGMV) as refundGMV,
 sum(rej.rejectedGMV) as rejectedGMV,
 sum(pg.failedGMV) as failedGMV,
 sum(frd.fraudGMV) as fraudGMV,
 sum(cb.cbGMV) as cbGMV
 from
(SELECT mid, month(txn_started_at) as Month, category, sub_category as subCategory,
       sum(txn_amount) as attemptedGMV,
       sum(case when txn_status = 'SUCCESS' then txn_amount else 0 end) as successfulGMV,
       sum(case when txn_status <> 'SUCCESS' then txn_amount else 0 end) as failedGMV,
       sum(case when txn_status = 'SUCCESS' and refund_amount is not null and refund_amount > 0 then txn_amount end) as refundGMV
                FROM
                dwh.pg_olap
                WHERE category NOT IN ('OnPaytm', 'Test')
                AND mid IS NOT NULL
                AND txn_amount > 0
                AND txn_id IS NOT NULL
                AND ingest_date >=  date '2024-10-01'
                AND DATE (substr(cast(txn_started_at AS VARCHAR), 1, 10))  between date '2024-10-01' and date '2025-02-28'
group by 1,2,3,4) as pg

-- Mapping QR/EDC mids
left join (
  select distinct merchant_id as o_mid
  from datalake.online_payment_merchants
) as m_3 on pg.mid = m_3.o_mid
left join (
  select distinct mid as e_mid
  from paytmpgdb.entity_edc_info_snapshot_v3
  where terminal_status = 'ACTIVE'
    and dl_last_updated >= date '2010-01-01'
) as m_4 on pg.mid = m_4.e_mid

-- Mapping channel (sub-business)
left join (
  select distinct pg_mid, channel
  from cdo.total_offline_merchant_base_snapshot_v3
) as m_7 on pg.mid = m_7.pg_mid

-- Mapping MCC code via category / sub-category
left join (
  select distinct upi_mcc_code as mcc, category, sub_category
  from paytmpgdb.mcc_code_mapping_snapshot_v3
) as mcc on pg.category = mcc.category and pg.subCategory = mcc.sub_category

-- Mapping rejected GMV (risk action = BLOCK), deduplicated per transactionid
left join (
  select
    mid,
    month(dateinserted) as Month,
    sum(txn_amount) as rejectedGMV
  from (
    select
      transactionid,
      max(dateinserted) as dateinserted,
      max(cast(eventamount as double) / 100) as txn_amount,
      max(paytmmerchantid) as mid
    from cdp_risk_transform.maquette_flattened_offus_snapshot_v3
    where dl_last_updated >= date '2024-10-01'
      and date(dateinserted) between date '2024-10-01' and date '2025-02-28'
      and actionrecommended = 'BLOCK'
    group by 1
  )
  group by 1, 2
) as rej on pg.mid = rej.mid and pg.Month = rej.Month

-- Mapping fraudGMV from combined fraud sources, deduplicated per txn_id
left join (
  select
    mid,
    month(txn_date) as Month,
    sum(txn_amount) as fraudGMV
  from (
    select
      old_pg_txn_id as txn_id,
      min(cast(old_pg_txn_amount as double)) as txn_amount,
      min(date(old_pg_txn_started_at)) as txn_date,
      min(old_pg_ingest_date) as old_pg_ingest_date,
      min(old_pg_mid) as mid
    from frauds.fraud_combined_snapshot_v3
    where dl_last_updated >= date '2024-10-01'
      and date(old_pg_txn_started_at) between date '2024-10-01' and date '2025-02-28'
      and table_name in (
        'ppsl_cybercell',
        'ro_panel_cybmerchant_details_with_pg_olap',
        'lending_fraud',
        'efrm',
        'ppsl_bank_escalations',
        'ro_panel_minifmr_l2_PPI',
        'ro_panel_minifmr_l2_BNK'
      )
      and old_pg_category not in ('OnPaytm', 'Test')
      and old_pg_txn_status = 'SUCCESS'
      and old_pg_mid is not null
      and cast(old_pg_txn_amount as double) > 0
      and old_pg_txn_id is not null
    group by 1
  )
  group by 1, 2
) as frd on pg.mid = frd.mid and pg.Month = frd.Month
-- Mapping chargeback GMV, deduplicated per transaction_id
left join (
  select
    mid,
    month(date(substr(cast(old_pg_txn_started_at as varchar), 1, 10))) as Month,
    count(transaction_id) as cbCount,
    sum(cbGMV) as cbGMV
  from (
    select
      transaction_id,
      max(old_pg_mid) as mid,
      sum(chargeback_amount) as cbGMV,
      max(old_pg_txn_started_at) as old_pg_txn_started_at
    from cdp_risk_transform.transaction_level_chargeback_snapshot_v3
    where old_pg_ingest_date >= date '2024-10-01'
      and date(substr(cast(old_pg_txn_started_at as varchar), 1, 10)) between date '2024-10-01' and date '2025-02-28'
    group by 1
  )
  group by 1, 2
) as cb on pg.mid = cb.mid and pg.Month = cb.Month
where pg.mid = 'NEHAEL54380289516455'
group by 1, 2, 3, 4, 5, 6, 7;
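Queries like this one can be driven from Python with the trino client library. A minimal sketch; the host, port, user, catalog, and schema values are placeholders for your own environment, and merchant_gmv_sql is assumed to hold the statement above:

import trino  # pip install trino

# Connection details are placeholders, not real endpoints.
conn = trino.dbapi.connect(
    host="presto.example.internal",
    port=8080,
    user="analyst",
    catalog="hive",
    schema="dwh",
)

def run_query(sql: str) -> list:
    """Execute a statement and return all result rows."""
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchall()

# rows = run_query(merchant_gmv_sql)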
Thu Jun 05 2025 12:04:35 GMT+0000 (Coordinated Universal Time) https://www.addustechnologies.com/p2p-crypto-exchange-software
@Seraphina

Thu Jun 05 2025 11:34:06 GMT+0000 (Coordinated Universal Time) https://wisewaytec.com/blockchain-development-company/
@snehawt15

Thu Jun 05 2025 10:46:27 GMT+0000 (Coordinated Universal Time) https://appticz.com/ios-app-development-company
@aditi_sharma_

Thu Jun 05 2025 10:02:48 GMT+0000 (Coordinated Universal Time)
@dannygelf #salesforce #permissions #soql

Thu Jun 05 2025 09:21:49 GMT+0000 (Coordinated Universal Time)
@Shivam3.tyagi

Thu Jun 05 2025 08:02:15 GMT+0000 (Coordinated Universal Time)
@davidmchale #mapping #select

Thu Jun 05 2025 07:41:07 GMT+0000 (Coordinated Universal Time)
@shubhangi.b

Thu Jun 05 2025 07:40:25 GMT+0000 (Coordinated Universal Time)
@shubhangi.b

Thu Jun 05 2025 06:02:06 GMT+0000 (Coordinated Universal Time)
@Pulak

Thu Jun 05 2025 05:29:42 GMT+0000 (Coordinated Universal Time)
@shubhangi.b

Wed Jun 04 2025 22:14:56 GMT+0000 (Coordinated Universal Time)
@vjg #python

Wed Jun 04 2025 10:58:48 GMT+0000 (Coordinated Universal Time)
@chitss2610

Wed Jun 04 2025 10:52:18 GMT+0000 (Coordinated Universal Time)
@chitss2610

Wed Jun 04 2025 10:51:22 GMT+0000 (Coordinated Universal Time)
@chitss2610

Wed Jun 04 2025 10:16:59 GMT+0000 (Coordinated Universal Time)
@MinaTimo

Wed Jun 04 2025 09:25:04 GMT+0000 (Coordinated Universal Time)
@shubhangi.b

Wed Jun 04 2025 04:33:00 GMT+0000 (Coordinated Universal Time)
@cvanwert #python

Tue Jun 03 2025 19:18:21 GMT+0000 (Coordinated Universal Time)
@freepythoncode #python #coding

Tue Jun 03 2025 09:27:21 GMT+0000 (Coordinated Universal Time) https://cryptocurrency-exchange-development-company.com/
@raydensmith

Tue Jun 03 2025 09:21:16 GMT+0000 (Coordinated Universal Time)
@shubhangi.b

Tue Jun 03 2025 09:15:34 GMT+0000 (Coordinated Universal Time) https://www.coinsclone.com/how-to-create-a-crypto-wallet/
@CharleenStewar #build a crypto wallet app

Mon Jun 02 2025 12:26:56 GMT+0000 (Coordinated Universal Time) https://www.beleaftechnologies.com/bc-game-clone-script
@raydensmith #bc #game #clone #script

Mon Jun 02 2025 09:34:37 GMT+0000 (Coordinated Universal Time) https://beleaftechnologies.com/cryptocurrency-mlm-software-development
@stvejhon #crypto #cryptocurrency #exchange #meme

Mon Jun 02 2025 08:28:15 GMT+0000 (Coordinated Universal Time)
@abm #zapier #taconova #python #scraping

Mon Jun 02 2025 06:03:51 GMT+0000 (Coordinated Universal Time) https://medium.com/nerd-for-tech/binance-clone-script-quick-way-to-launch-a-cryptocurrency-exchange-d64c95754703
@janetbrownjb #cryptoexchangedevelopment #cryptocurrencybusiness #launchwithbinanceclonescript #binanceclonescriptsolutions #cryptostartupideas

Mon Jun 02 2025 02:15:38 GMT+0000 (Coordinated Universal Time) https://microsoftedge.microsoft.com/addons/detail/flash-player-2025/fgenmmklgkdemhpgdppmldmkemplbcko
@Asneedarazali

Mon Jun 02 2025 01:59:13 GMT+0000 (Coordinated Universal Time) https://docs.github.com/en/copilot/managing-copilot/configure-personal-settings/installing-github-copilot-in-the-cli
@Asneedarazali

Sun Jun 01 2025 14:50:13 GMT+0000 (Coordinated Universal Time)
@nani #python

Sun Jun 01 2025 14:47:35 GMT+0000 (Coordinated Universal Time)
@nani #python

Sun Jun 01 2025 14:43:00 GMT+0000 (Coordinated Universal Time)
@nani #python

Sun Jun 01 2025 14:41:15 GMT+0000 (Coordinated Universal Time)
@nani #python

Sun Jun 01 2025 03:14:09 GMT+0000 (Coordinated Universal Time)
@davidmchale #rest #operator #...args

Sun Jun 01 2025 00:33:48 GMT+0000 (Coordinated Universal Time)
@davidmchale #maps #toggling #decision

Sat May 31 2025 23:37:07 GMT+0000 (Coordinated Universal Time) https://alienshanu.me/
@blackalien

Sat May 31 2025 21:03:41 GMT+0000 (Coordinated Universal Time) https://rustme.net/cabinet
@Kifircheck

Sat May 31 2025 10:20:01 GMT+0000 (Coordinated Universal Time) https://www.beleaftechnologies.com/bc-game-clone-script
@raydensmith #bc #game #bcgameclonescript

Sat May 31 2025 05:10:23 GMT+0000 (Coordinated Universal Time) https://freeytthumbnail.com/
@FreeYt #python

Fri May 30 2025 13:29:58 GMT+0000 (Coordinated Universal Time)
@moussa_29

Fri May 30 2025 07:51:50 GMT+0000 (Coordinated Universal Time) https://cryptocurrency-exchange-development-company.com/
@raydensmith #cryptoexchange

Fri May 30 2025 05:46:04 GMT+0000 (Coordinated Universal Time)
@shubhangi.b

Fri May 30 2025 02:01:00 GMT+0000 (Coordinated Universal Time)
@FOHWellington

Fri May 30 2025 01:56:28 GMT+0000 (Coordinated Universal Time)
@FOHWellington

Thu May 29 2025 11:03:28 GMT+0000 (Coordinated Universal Time)
@Shivam3.tyagi

Thu May 29 2025 10:58:44 GMT+0000 (Coordinated Universal Time)
@Shivam3.tyagi