JSON field grab via shell script/awk

I have JSON data that looks like this:

Code:
{
  "ip": "16.66.35.10",
  "hostname": "No Hostname",
  "city": "Stepney",
  "region": "England",
  "country": "GB",
  "loc": "51.57,-0.0333",
  "org": "AS6871 British Telecommunications PLC",
  "postal": "E1"
}

I'm looking for a way to assign each value from the above to a variable without having to make multiple external calls to any particular tool like awk (although I love awk).

If awk can be used for this, I don't mind it.

But here's what I'm currently doing:


Code:
# flatten the JSON onto one line, then pull the pieces apart on commas
IpInfo=$(cat jsonfile | tr -d '\n')

ip=$(echo "${IpInfo}" | awk -F"," '{print $1}')        # still contains the quoted key, e.g. {  "ip": "16.66.35.10"
hname=$(echo "${IpInfo}" | awk -F"," '{print $2}')
city=$(echo "${IpInfo}" | awk -F"," '{print $3}')
region=$(echo "${IpInfo}" | awk -F"," '{print $4}')
country=$(echo "${IpInfo}" | awk -F"," '{print $5}')
location=$(echo "${IpInfo}" | awk -F"," '{print $6}')  # the comma inside the "loc" value also throws fields 6-8 off
organiz=$(echo "${IpInfo}" | awk -F"," '{print $7}')
postal=$(echo "${IpInfo}" | awk -F"," '{print $8}')

As you can see, this is very bad, as I'm making eight separate calls to awk.

How can I achieve what I'm trying to do?
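
For what it's worth, one possible single-pass direction could look something like the rough sketch below. It is only a sketch under two assumptions: the JSON stays in the one "key": "value" pair per line layout shown above, and the values never contain a double quote, dollar sign, or backquote (eval would otherwise be unsafe). One awk pass splits each line on double quotes, prints one shell assignment per recognised key, and eval loads them all into the current shell:

Code:
# single awk pass: with -F'"', $2 is the key and $4 is the value on each "key": "value" line;
# print one shell assignment per key, then eval the whole batch at once
eval "$(awk -F'"' '
    $2 == "ip"       { printf "ip=\"%s\"\n",       $4 }
    $2 == "hostname" { printf "hname=\"%s\"\n",    $4 }
    $2 == "city"     { printf "city=\"%s\"\n",     $4 }
    $2 == "region"   { printf "region=\"%s\"\n",   $4 }
    $2 == "country"  { printf "country=\"%s\"\n",  $4 }
    $2 == "loc"      { printf "location=\"%s\"\n", $4 }
    $2 == "org"      { printf "organiz=\"%s\"\n",  $4 }
    $2 == "postal"   { printf "postal=\"%s\"\n",   $4 }
' jsonfile)"

echo "$city ($country) - $location"

Or, to avoid external calls entirely, the shell itself can do the splitting (same layout assumption; no awk, no tr):

Code:
# no external commands at all: IFS='"' makes read split each line on double quotes,
# so the 2nd field is the key and the 4th is the value; the _ fields are discarded
while IFS='"' read -r _ key _ value _; do
    case $key in
        ip)       ip=$value ;;
        hostname) hname=$value ;;
        city)     city=$value ;;
        region)   region=$value ;;
        country)  country=$value ;;
        loc)      location=$value ;;
        org)      organiz=$value ;;
        postal)   postal=$value ;;
    esac
done < jsonfile

If installing a proper JSON parser is an option, a single jq call would handle quoting and embedded commas far more robustly than any line-based split like the two sketches above.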
 
