Hi all,
Is it possible to decode a core dump file to find the error?
I get a memory core dump error along with a core file.
Regards,
P. Prathaban. (3 Replies)
I tried to decode a binary script using the command 'uudecode', but it gives the error 'No beginning line'.
'uudecode -o <outfile name> <binary file>'
Please help me resolve this. (4 Replies)
Hi, I'm wondering if anyone can suggest a tool that will let me either cut and paste hex or type it in for packet decoding.
I want to be able to decode a packet the way tcpdump or Wireshark does, but entering the hex manually. (2 Replies)
The challenge:
Decode URLs, i.e. convert each %HEX escape to the corresponding special character, using only UNIX base utilities, and without having to type out each special character.
I have an anonymous C code snippet where the author assigns each hex digit a value from 0 to 15 and then does some... (2 Replies)
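For illustration, a minimal C sketch of the technique such snippets typically use: map each hex digit to its value (0 to 15) and combine the two digits that follow a '%'. The helper names hexval and url_decode are mine, not from the original snippet.

#include <stdio.h>

/* Value of one hex digit, or -1 if the character is not a hex digit. */
static int hexval(char c)
{
    if (c >= '0' && c <= '9') return c - '0';
    if (c >= 'a' && c <= 'f') return c - 'a' + 10;
    if (c >= 'A' && c <= 'F') return c - 'A' + 10;
    return -1;
}

/* Decode %XX escapes in place; malformed escapes are left untouched. */
static void url_decode(char *s)
{
    char *out = s;
    while (*s) {
        if (s[0] == '%' && hexval(s[1]) >= 0 && hexval(s[2]) >= 0) {
            *out++ = (char)(hexval(s[1]) * 16 + hexval(s[2]));
            s += 3;
        } else {
            *out++ = *s++;
        }
    }
    *out = '\0';
}

int main(void)
{
    char url[] = "hello%20world%21";
    url_decode(url);
    puts(url);  /* prints: hello world! */
    return 0;
}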
I am trying to understand a UNIX script that FTPs certain files from a remote location to the local machine. I understand the basic ftp command, but the script uses the following invocation:
ftp -n -i -v > $logftp_trg 2>&1 <<!
open $MFX_FTP_SERVER
user $MFX_FTP_LOGIN $MFX_FTP_PWD
Can anyone... (5 Replies)
Hello,
I have a huge file with over 700,000 SNPs and 18 columns. One column is in the format
--+-+
---++
?????
-????
Now I have another list that corresponds to this code in a particular order:
A-1
B-7
C-11
D-3
E-100
Now I need to match the expression above to the pattern,... (1 Reply)
Hi, fellows
I am modifying an ASN.1 schema to be able to decode a file, but I am hitting an error on one of the fields when using the free online tool asn1-playground. I suspect I need to change the type and have tried IDENTIFIER, but it doesn't help. Any ideas? Please check the schema and file below.
... (0 Replies)
I am struggling here to understand...
The default encoding.
See photo 1.
Why does this NOT work?
#!/bin/bash
# Code for OSX 10.13.5, default UNICODE encoding.
echo ""
echo "The default UTF-8..."
locale
echo ""
echo "Change to 8 bit ASCII only..."
LANG="en_GB.US-ASCII"
export... (2 Replies)
LEARN ABOUT NETBSD
kcpuset_clear
KCPUSET(9) BSD Kernel Developer's Manual KCPUSET(9)
NAME
kcpuset, kcpuset_create, kcpuset_destroy, kcpuset_copy, kcpuset_use, kcpuset_unuse, kcpuset_copyin, kcpuset_copyout, kcpuset_zero,
kcpuset_fill, kcpuset_set, kcpuset_clear, kcpuset_isset, kcpuset_iszero, kcpuset_match, kcpuset_merge, kcpuset_atomic_set,
kcpuset_atomic_clear -- dynamic kernel CPU sets
SYNOPSIS
#include <sys/kcpuset.h>
void
kcpuset_create(kcpuset_t **retkcp, bool zero);
void
kcpuset_destroy(kcpuset_t *kcp);
void
kcpuset_copy(kcpuset_t *dkcp, kcpuset_t *skcp);
void
kcpuset_use(kcpuset_t *kcp);
void
kcpuset_unuse(kcpuset_t *kcp, kcpuset_t **lst);
int
kcpuset_copyin(const cpuset_t *ucp, kcpuset_t *kcp, size_t len);
int
kcpuset_copyout(kcpuset_t *kcp, cpuset_t *ucp, size_t len);
void
kcpuset_zero(kcpuset_t *kcp);
void
kcpuset_fill(kcpuset_t *kcp);
void
kcpuset_set(kcpuset_t *kcp, cpuid_t cpu);
void
kcpuset_clear(kcpuset_t *kcp, cpuid_t cpu);
int
kcpuset_isset(kcpuset_t *kcp, cpuid_t cpu);
bool
kcpuset_iszero(kcpuset_t *kcp);
bool
kcpuset_match(const kcpuset_t *kcp1, const kcpuset_t *kcp2);
void
kcpuset_merge(kcpuset_t *kcp1, kcpuset_t *kcp2);
void
kcpuset_atomic_set(kcpuset_t *kcp, cpuid_t cpu);
void
kcpuset_atomic_clear(kcpuset_t *kcp, cpuid_t cpu);
DESCRIPTION
The machine-independent kcpuset subsystem provides support for dynamic processor sets. Conceptually, kcpuset can be understood to be the kernel equivalent of the user space cpuset(3) interface.
FUNCTIONS
kcpuset_create(retkcp, zero)
The kcpuset_create() function creates a dynamic CPU set and stores the result to retkcp. If the boolean zero is not false, the
allocated set is also initialized to zero.
kcpuset_destroy(kcp)
Destroys the CPU set kcp and schedules any linked CPU sets for deferred destruction.
kcpuset_copy(dkcp, skcp)
Copies the CPU set pointed to by skcp to dkcp.
kcpuset_use(kcp)
Marks kcp as being in use by increasing the reference count of the object. Note that initially kcpuset_create() sets the reference
count to 1.
kcpuset_unuse(kcp, lst)
Decreases the internal reference count of kcp and, on the last reference (when the count reaches zero), destroys kcp. If lst is not
NULL, then instead of being destroyed, kcp will be added to the lst list for deferred destruction.
kcpuset_copyin(ucp, kcp, len)
Copies the user-space CPU set ucp, len bytes long, to the kernel CPU set kcp.
kcpuset_copyout(kcp, ucp, len)
Copies the kernel CPU set kcp to the user-space CPU set ucp.
kcpuset_zero(kcp)
Clears the set kcp.
kcpuset_fill(kcp)
Fills the whole set kcp with ones.
kcpuset_set(kcp, cpu)
Adds cpu to the set kcp.
kcpuset_clear(kcp, cpu)
Removes cpu from the set kcp.
kcpuset_isset(kcp, cpu)
Returns 1 if cpu is part of the CPU set kcp.
kcpuset_iszero(kcp)
Returns true if the set kcp is empty.
kcpuset_match(kcp1, kcp2)
Compares the sets kcp1 and kcp2, returning true if they are identical.
kcpuset_merge(kcp1, kcp2)
Merges the set kcp2 into the set kcp1.
kcpuset_atomic_set(kcp, cpu)
The kcpuset_atomic_set() function operates as kcpuset_set(), but the operation is atomic; see atomic_ops(3) for more details.
kcpuset_atomic_clear(kcp, cpu)
Removes cpu from the CPU set kcp atomically.
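EXAMPLES
The following minimal sketch (not taken from the kernel sources; error handling and locking omitted) illustrates the typical call pattern: create a set, mark a CPU, test membership, and release the set.

     #include <sys/kcpuset.h>

     void
     example(cpuid_t cpu)
     {
             kcpuset_t *kcp;

             kcpuset_create(&kcp, true);      /* allocate the set, zeroed */
             kcpuset_set(kcp, cpu);           /* add one CPU to the set */
             if (kcpuset_isset(kcp, cpu)) {
                     /* cpu is a member of the set */
             }
             kcpuset_destroy(kcp);            /* release the set */
     }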
CODE REFERENCES
The kcpuset subsystem is implemented within sys/kern/subr_kcpuset.c.
SEE ALSO
cpuset(3)
HISTORY
The kcpuset subsystem first appeared in NetBSD 6.0.
BSD October 6, 2011 BSD