Decoding a string
Post 302608941 by balajesuri in UNIX for Dummies Questions & Answers, Monday 19 March 2012, 03:38 AM
Your code takes 0.02s on my machine! I'm working on a server with six quad-core CPUs, each clocked at 2.4 GHz, running RHEL.

Try this. I don't claim it's the most efficient, but give it a try:

Code:
#! /bin/bash
# Decode a run-length-encoded string, e.g. "3a4b3x3e" -> "aaabbbbxxxeee".
# Note: handles single-digit counts only.

a=$1
len=${#a}                       # string length; no need to spawn wc
for ((i = 0; i < len; i++))     # valid indices are 0 .. len-1
do
    ch=${a:$i:1}
    if [[ $ch =~ [0-9] ]]
    then
        times=$ch               # remember the repeat count
    else
        for ((j = 1; j <= times; j++))
        do
            printf '%s' "$ch"   # portable; avoids echo -e "\c"
        done
    fi
done
echo                            # final newline

Code:
# ./test.sh 3a4b3x3e
aaabbbbxxxeee
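For comparison, the same decoding can be done in a single awk pass, avoiding the per-character bash loop. This is just a sketch under the same assumption as the script above (single-digit counts, each followed by one character), and I haven't benchmarked it against the bash version:

```shell
# Run-length decode with awk: read digit/character pairs left to right.
printf '%s\n' "3a4b3x3e" | awk '{
    for (i = 1; i <= length($0); i += 2) {
        n  = substr($0, i, 1)        # the repeat count (one digit)
        ch = substr($0, i + 1, 1)    # the character to repeat
        for (j = 0; j < n; j++) printf "%s", ch
    };
    print ""
}'
# aaabbbbxxxeee
```

Using substr() rather than split($0, c, "") keeps it portable to POSIX awk, where an empty-string field separator is undefined behaviour.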

KCPUSET(9)              BSD Kernel Developer's Manual              KCPUSET(9)

NAME
     kcpuset, kcpuset_create, kcpuset_destroy, kcpuset_copy, kcpuset_use,
     kcpuset_unuse, kcpuset_copyin, kcpuset_copyout, kcpuset_zero,
     kcpuset_fill, kcpuset_set, kcpuset_clear, kcpuset_isset,
     kcpuset_iszero, kcpuset_match, kcpuset_merge, kcpuset_atomic_set,
     kcpuset_atomic_clear -- dynamic kernel CPU sets

SYNOPSIS
     #include <sys/kcpuset.h>

     void kcpuset_create(kcpuset_t **retkcp, bool zero);
     void kcpuset_destroy(kcpuset_t *kcp);
     void kcpuset_copy(kcpuset_t *dkcp, kcpuset_t *skcp);
     void kcpuset_use(kcpuset_t *kcp);
     void kcpuset_unuse(kcpuset_t *kcp, kcpuset_t **lst);
     int kcpuset_copyin(const cpuset_t *ucp, kcpuset_t *kcp, size_t len);
     int kcpuset_copyout(kcpuset_t *kcp, cpuset_t *ucp, size_t len);
     void kcpuset_zero(kcpuset_t *kcp);
     void kcpuset_fill(kcpuset_t *kcp);
     void kcpuset_set(kcpuset_t *kcp, cpuid_t cpu);
     void kcpuset_clear(kcpuset_t *kcp, cpuid_t cpu);
     int kcpuset_isset(kcpuset_t *kcp, cpuid_t cpu);
     bool kcpuset_iszero(kcpuset_t *kcp);
     bool kcpuset_match(const kcpuset_t *kcp1, const kcpuset_t *kcp2);
     void kcpuset_merge(kcpuset_t *kcp1, kcpuset_t *kcp2);
     void kcpuset_atomic_set(kcpuset_t *kcp, cpuid_t cpu);
     void kcpuset_atomic_clear(kcpuset_t *kcp, cpuid_t cpu);

DESCRIPTION
     The machine-independent kcpuset subsystem provides support for dynamic
     processor sets. Conceptually, kcpuset can be understood as the kernel
     equivalent of the user space cpuset(3) interface.

FUNCTIONS
     kcpuset_create(retkcp, zero)
             Creates a dynamic CPU set and stores the result in retkcp. If
             the boolean zero is not false, the allocated set is also
             initialized to zero.

     kcpuset_destroy(kcp)
             Destroys the CPU set kcp and schedules any linked CPU sets for
             deferred destruction.

     kcpuset_copy(dkcp, skcp)
             Copies the CPU set pointed to by skcp to dkcp.

     kcpuset_use(kcp)
             Marks kcp as being in use by increasing the reference count of
             the object. Note that kcpuset_create() initially sets the
             reference count to 1.

     kcpuset_unuse(kcp, lst)
             Decreases the internal reference count of kcp and, on the last
             reference (when the count reaches zero), destroys kcp. If lst
             is not NULL, kcp is instead added to the lst list for deferred
             destruction.

     kcpuset_copyin(ucp, kcp, len)
             Copies the len bytes long user-space CPU set ucp to the kernel
             CPU set kcp.

     kcpuset_copyout(kcp, ucp, len)
             Copies the kernel CPU set kcp to the user-space CPU set ucp.

     kcpuset_zero(kcp)
             Clears the set kcp.

     kcpuset_fill(kcp)
             Fills the whole set kcp with ones.

     kcpuset_set(kcp, cpu)
             Adds cpu to the set kcp.

     kcpuset_clear(kcp, cpu)
             Removes cpu from the set kcp.

     kcpuset_isset(kcp, cpu)
             Returns 1 if cpu is part of the CPU set kcp.

     kcpuset_iszero(kcp)
             Returns true if the set kcp is empty.

     kcpuset_match(kcp1, kcp2)
             Compares the sets kcp1 and kcp2, returning true if they are
             identical.

     kcpuset_merge(kcp1, kcp2)
             Merges the set kcp2 into the set kcp1.

     kcpuset_atomic_set(kcp, cpu)
             Operates as kcpuset_set(), but the operation is atomic; see
             atomic_ops(3) for more details.

     kcpuset_atomic_clear(kcp, cpu)
             Removes cpu from the CPU set kcp atomically.

CODE REFERENCES
     The kcpuset subsystem is implemented within sys/kern/subr_kcpuset.c.

SEE ALSO
     cpuset(3)

HISTORY
     The kcpuset subsystem first appeared in NetBSD 6.0.

BSD                            October 6, 2011                            BSD
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.