Shell Programming and Scripting: Remove first meta key from JSON records using shell
Post 303028921 by Cloud_Ninja, 01-18-2019, 03:27 AM
Thank you for the quick reply; however, this is not working as expected in my case. My JSON is nested, so it deleted the whole first-level chunk of data altogether.
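
For what it's worth, a minimal sketch of one way to drop only a first-level key without touching nested objects, assuming jq is available and that the key is literally named "meta" (both the key name and the file name are placeholders, since the thread does not show the actual data):

    # Hypothetical input: records.json, one JSON object per record.
    # del(.meta) removes only the top-level "meta" key; nested objects
    # that contain their own "meta" key are left untouched.
    jq 'del(.meta)' records.json > records_trimmed.json

With sed or awk alone it is easy to delete far more than intended, which may be what happened here; a JSON-aware tool keeps the nesting intact.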
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Generating key values for leader records

All, I have a file with text as shown below. I want the output file to have generated values in the first column, as shown in the output file. Please note that my file is 6 GB. How do I do this?
Input file:
999999abcdef
999999ghijkl
999999mnopq
777777rosesarered
777777skyisblue
Output... (1 Reply)
Discussion started by: ajfaq
1 Reply

2. Shell Programming and Scripting

How to delete duplicate records based on key

For example, suppose I have a file which contains data such as:
$ cat data
800,2
100,9
700,3
100,9
200,8
100,3
Now I want the output to be:
200,8
700,3
800,2
The key is the first three characters; I don't want any records that have duplicate keys. Like sort +0.0 -0.3 data, can we use... (9 Replies)
Discussion started by: sumitc
9 Replies
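
A sketch for the duplicate-key question above, assuming the key really is the first three characters and that every record sharing a duplicated key should be dropped, as the sample output suggests:

    # First pass counts each 3-character key; second pass prints only
    # records whose key occurred exactly once, then sorts the result.
    awk '{ k = substr($0, 1, 3) }
         NR == FNR { count[k]++; next }
         count[k] == 1' data data | sort

On the sample data this keeps 200,8, 700,3 and 800,2 and drops every 100,* record, matching the expected output.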

3. Shell Programming and Scripting

How can i use shell meta characters like * ,$ <,> ?

Hey guys! Please tell me how I can use shell metacharacters such as *, $, <, and > literally in command-line arguments. Please reply soon; it's urgent! (8 Replies)
Discussion started by: tprayush
8 Replies
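
For the metacharacter question above, the usual answer is quoting or escaping; a quick sketch (the strings and file names are made up):

    # Single quotes stop the shell from touching metacharacters:
    echo 'cost is $5 < $10'       # printed literally; $ and < are not special here
    # A backslash escapes a single character:
    echo \*                       # prints a literal *
    # Double quotes still expand $variables but keep < and > literal:
    echo "$HOME has < 10 files"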

4. Shell Programming and Scripting

awk - splitting 1 large file into multiple based on same key records

Hello gurus, I am new to awk and am trying to break a large file of 4 million records into several output files of half a million records each, but at the same time I want to keep records with the same key in the same output file, not spread across files. E.g. my data is like: Row_Num,... (6 Replies)
Discussion started by: kam66
6 Replies
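
A sketch for the file-splitting question above, assuming the input is already grouped or sorted by its key, that the key is the first comma-separated field, and that the file name is a placeholder (all assumptions, since the excerpt is truncated):

    # Start a new output file roughly every 500000 records, but only
    # when the key changes, so rows sharing a key stay in one file.
    awk -F',' '
        BEGIN { part = 1 }
        $1 != prev && n >= 500000 { close("part_" part ".csv"); part++; n = 0 }
        { print > ("part_" part ".csv"); n++; prev = $1 }
    ' input.csv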

5. UNIX for Advanced & Expert Users

howto remove meta info about MP4 or FLV file downloaded off Youtube?

Hi, I tried a tool called mediainfo:
> brew info media-info
media-info 0.7.51
http://mediainfo.sourceforge.net
Depends on: pkg-config
/usr/local/Cellar/media-info/0.7.51 (3 files, 14M)
http://github.com/mxcl/homebrew/commits/master/Library/Formula/media-info.rb
Got details from a test... (3 Replies)
Discussion started by: slashdotweenie
3 Replies
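
mediainfo only inspects metadata; for actually stripping it, one widely used approach (not mentioned in the excerpt, so treat it as a suggestion rather than the thread's answer) is ffmpeg's metadata mapping option:

    # Copy the audio/video streams untouched (-c copy) but drop the
    # container's global metadata. File names are placeholders.
    ffmpeg -i input.mp4 -map_metadata -1 -c copy output.mp4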

6. Shell Programming and Scripting

Removing specific records from files when duplicate key

Hello, I have been trying to remove a row from a file which has the same first three columns as another row. I have tried lots of different combinations of suggestions on this forum but can't get it exactly right. What I have is:
900 - 1000 = 0
900 - 1000 = 2562
1000 - 1100 = 0
1000 - 1100... (7 Replies)
Discussion started by: tinytimmay
7 Replies
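
The intended rule in the question above is not fully shown (the excerpt is truncated), but if the goal is to keep only one row per "first three columns" key, a common awk idiom is:

    # Prints a row only the first time its first three columns are seen.
    awk '!seen[$1, $2, $3]++' input.txt

If instead the row with the non-zero right-hand value should win, a two-pass approach over the file would be needed.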

7. Shell Programming and Scripting

Passing key column from parent to child records

Hi Forum. I have this challenging issue that I'm hoping someone can help me with. I have a file that contains 3 different types of segments (AM00, AM01, AM32) in a hierarchical structure, and I want to be able to pass the key column from the parent record to the child records. AM00 - parent key:... (13 Replies)
Discussion started by: pchang
13 Replies
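
The exact record layout in the question above is cut off, so the following is only a hedged sketch; it assumes the segment type (AM00/AM01/AM32) is the first field and the parent key is the second field of each AM00 record:

    # Remember the key from each AM00 parent and append it to the
    # AM01/AM32 children that follow, until the next AM00 appears.
    awk '
        $1 == "AM00" { key = $2; print; next }
        $1 == "AM01" || $1 == "AM32" { print $0, key; next }
        { print }
    ' input.txt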

8. UNIX for Beginners Questions & Answers

Json field grap via shell script/awk

I have JSON data that looks like this:
{
  "ip": "16.66.35.10",
  "hostname": "No Hostname",
  "city": "Stepney",
  "region": "England",
  "country": "GB",
  "loc": "51.57,-0.0333",
  "org": "AS6871 British Telecommunications PLC",
  "postal": "E1"
}
I'm looking for a way to assign... (9 Replies)
Discussion started by: SkySmart
9 Replies
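
For the JSON-field question above, a minimal sketch assuming jq is available (the file name is a placeholder):

    # Pull individual fields out of the JSON shown in the question
    # and assign them to shell variables.
    ip=$(jq -r '.ip' info.json)
    city=$(jq -r '.city' info.json)
    country=$(jq -r '.country' info.json)
    echo "$ip is in $city, $country"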

9. Shell Programming and Scripting

Remove lines from multiline json

Hi, when extracting the data from an API endpoint, it returns multi-line JSON. I want to remove certain lines with group": "tag" or tget and respect the "item" values:
python test.py /data{" Id":" 7554317""group":"get", "item":"xx5e1"],"fields":} { "time": 1520460953, " ... (4 Replies)
Discussion started by: akil
4 Replies
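
The sample in the question above is garbled, so this is only a guess at the intent: if the input is a stream of JSON objects and the ones whose "group" is "get" (or "tget") should be dropped while the remaining objects keep their "item" values, a jq filter along these lines might work:

    # Keep only objects whose "group" field is neither "get" nor "tget".
    jq 'select(.group != "get" and .group != "tget")' data.json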

10. UNIX for Beginners Questions & Answers

How to convert any shell command output to JSON format?

Hi All, I am new to shell scripting and need your help in creating a shell script which converts any Unix command output to JSON format. Example: sample df -h command output:
Filesystem            size   used  avail  capacity  Mounted
/dev/dsk/c1t0d0s0     8.1G   4.0G   4.0G    50%     /... (13 Replies)
Discussion started by: balu1234
13 Replies
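
A rough sketch for the df-to-JSON question above; it skips the header line and emits one JSON object per filesystem. The field names are assumptions based on the sample header, and lines that df wraps onto two rows would need extra handling:

    # Convert `df -h` output into a JSON array of objects.
    df -h | awk '
        NR == 1 { next }                       # skip the header line
        {
            printf "%s{\"filesystem\":\"%s\",\"size\":\"%s\",\"used\":\"%s\",\"avail\":\"%s\",\"capacity\":\"%s\",\"mounted\":\"%s\"}",
                   (n++ ? "," : "["), $1, $2, $3, $4, $5, $6
        }
        END { print "]" }
    '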
pack_fopen_chunk(3alleg4)                 Allegro manual                 pack_fopen_chunk(3alleg4)

NAME
       pack_fopen_chunk - Opens a sub-chunk of a file. Allegro game programming library.

SYNOPSIS
       #include <allegro.h>

       PACKFILE *pack_fopen_chunk(PACKFILE *f, int pack);

DESCRIPTION
       Opens a sub-chunk of a file. Chunks are primarily intended for use by the datafile code, but they may also be useful for your own file routines. A chunk provides a logical view of part of a file, which can be compressed as an individual entity and will automatically insert and check length counts to prevent reading past the end of the chunk. The PACKFILE parameter is a previously opened file, and `pack' is a boolean parameter which will turn compression on for the sub-chunk if it is non-zero. Example:

          PACKFILE *output = pack_fopen("out.raw", "w!");
          ...
          /* Create a sub-chunk with compression. */
          output = pack_fopen_chunk(output, 1);
          if (!output)
             abort_on_error("Error saving data!");
          /* Write some data to the sub-chunk. */
          ...
          /* Close the sub-chunk, recovering parent file. */
          output = pack_fclose_chunk(output);

       The data written to the chunk will be prefixed with two length counts (32-bit, a.k.a. big-endian). For uncompressed chunks these will both be set to the size of the data in the chunk. For compressed chunks (created by setting the `pack' flag), the first length will be the raw size of the chunk, and the second will be the negative size of the uncompressed data. To read the chunk, use the following code:

          PACKFILE *input = pack_fopen("out.raw", "rp");
          ...
          input = pack_fopen_chunk(input, 1);
          /* Read data from the sub-chunk and close it. */
          ...
          input = pack_fclose_chunk(input);

       This sequence will read the length counts created when the chunk was written, and automatically decompress the contents of the chunk if it was compressed. The length will also be used to prevent reading past the end of the chunk (Allegro will return EOF if you attempt this), and to automatically skip past any unread chunk data when you call pack_fclose_chunk().

       Chunks can be nested inside each other by making repeated calls to pack_fopen_chunk(). When writing a file, the compression status is inherited from the parent file, so you only need to set the pack flag if the parent is not compressed but you want to pack the chunk data. If the parent file is already open in packed mode, setting the pack flag will result in data being compressed twice: once as it is written to the chunk, and again as the chunk passes it on to the parent file.

RETURN VALUE
       Returns a pointer to the sub-chunked PACKFILE, or NULL if there was some error (eg. you are using a custom PACKFILE vtable).

SEE ALSO
       pack_fclose_chunk(3alleg4), pack_fopen(3alleg4)

Allegro version 4.4.2                                                    pack_fopen_chunk(3alleg4)