Some uses of Hadoop HDFS
Example 3-1. Displaying files from a Hadoop filesystem on standard output using a URLStreamHandler
// Reading data from a Hadoop URL
import java.io.InputStream;
import java.net.URL;
import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.io.IOUtils;

public class URLCat {
    static {
        // Teach java.net.URL how to handle hdfs:// URLs
        URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
    }

    public static void main(String[] args) throws Exception {
        InputStream in = null;
        try {
            in = new URL(args[0]).openStream();
            // Copy the stream to standard output 4 KB at a time;
            // false means we close the stream ourselves in the finally block
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}
-----------------------------------------
Here's a sample run:

% hadoop URLCat hdfs://localhost/user/tom/quangle.txt
On the top of the Crumpetty Tree
The Quangle Wangle sat,
But his face you could not see,
On account of his Beaver Hat.

Note that setURLStreamHandlerFactory() can be invoked at most once per JVM, so this approach fails if some other part of your program has already set a stream handler factory. That limitation is why the FileSystem API shown in Example 3-2 is usually preferable.
Example 3-2. Displaying files from a Hadoop filesystem on standard output by using the FileSystem directly
import java.io.InputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileSystemCat {
    public static void main(String[] args) throws Exception {
        String uri = args[0];
        Configuration conf = new Configuration();
        // Look up the FileSystem instance for the URI's scheme (here, hdfs)
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        InputStream in = null;
        try {
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}
------------------------------------------
The program runs as follows:

% hadoop FileSystemCat hdfs://localhost/user/tom/quangle.txt
On the top of the Crumpetty Tree
The Quangle Wangle sat,
But his face you could not see,
On account of his Beaver Hat.
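The same FileSystem handle can also enumerate a directory rather than read a single file. Below is a minimal sketch, not one of the numbered examples above: the class name ListStatusSketch is ours, but listStatus() and FileUtil.stat2Paths() are standard Hadoop API.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;

public class ListStatusSketch {
    public static void main(String[] args) throws Exception {
        String uri = args[0];
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        // One FileStatus per entry under the given path
        FileStatus[] status = fs.listStatus(new Path(uri));
        // Convert the FileStatus objects back to Path objects for printing
        Path[] listedPaths = FileUtil.stat2Paths(status);
        for (Path p : listedPaths) {
            System.out.println(p);
        }
    }
}

Run it against a directory URI (for example, hdfs://localhost/user/tom) to print one line per entry.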
Example 3-3 is a simple extension of Example 3-2 that writes a file to standard output twice: after writing it once, it seeks to the start of the file and streams through it once again.
// Example 3-3. Displaying files from a Hadoop filesystem on standard output twice, by using seek
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileSystemDoubleCat {
    public static void main(String[] args) throws Exception {
        String uri = args[0];
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf); // obtain a FileSystem via get()
        FSDataInputStream in = null;
        try {
            in = fs.open(new Path(uri)); // open() returns an FSDataInputStream, which supports seek()
            IOUtils.copyBytes(in, System.out, 4096, false);
            in.seek(0); // go back to the start of the file
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}
----------------------------------------------------
Here's the result of running it on a small file:

% hadoop FileSystemDoubleCat hdfs://localhost/user/tom/quangle.txt
On the top of the Crumpetty Tree
The Quangle Wangle sat,
But his face you could not see,
On account of his Beaver Hat.
On the top of the Crumpetty Tree
The Quangle Wangle sat,
But his face you could not see,
On account of his Beaver Hat.
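FSDataInputStream also supports positioned reads through the PositionedReadable interface: read() and readFully() overloads that take an absolute offset and leave the stream's current position untouched. The sketch below is ours; the class name, the offset, and the buffer size are illustrative choices, and it assumes the file holds at least position + buffer.length bytes, while readFully() itself is standard API.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class PositionedReadSketch {
    public static void main(String[] args) throws Exception {
        String uri = args[0];
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        FSDataInputStream in = null;
        try {
            in = fs.open(new Path(uri));
            byte[] buffer = new byte[16];
            // Read exactly buffer.length bytes starting at offset 8;
            // unlike seek() followed by read(), the current offset is unchanged
            in.readFully(8, buffer);
            System.out.println(new String(buffer, "UTF-8"));
        } finally {
            IOUtils.closeStream(in);
        }
    }
}

Positioned reads preserve the stream's offset and are safe to call from multiple threads on the same stream, whereas seek() is not, which makes them a better fit for concurrent random access.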
Example 3-4 shows how to copy a local file to a Hadoop filesystem. We illustrate progress by printing a period every time the progress() method is called by Hadoop, which is after each 64 KB packet of data is written to the datanode pipeline. (Note that this particular behavior is not specified by the API, so it is subject to change in later versions of Hadoop. The API merely allows you to infer that "something is happening.")
// Example 3-4. Copying a local file to a Hadoop filesystem, showing progress
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

public class FileCopyWithProgress {
    public static void main(String[] args) throws Exception {
        String localSrc = args[0];
        String dst = args[1];
        InputStream in = new BufferedInputStream(new FileInputStream(localSrc));
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(dst), conf);
        OutputStream out = fs.create(new Path(dst), new Progressable() {
            public void progress() {
                // Called back by Hadoop as data is written to the pipeline
                System.out.print(".");
            }
        });
        // true: close both streams when the copy finishes
        IOUtils.copyBytes(in, out, 4096, true);
    }
}
-----------------------------------------
Typical usage:

% hadoop FileCopyWithProgress input/docs/1400-8.txt hdfs://localhost/user/tom/1400-8.txt
...............
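Once the copy finishes, you can confirm what was written by asking the filesystem for the file's metadata. A short sketch follows; the class name ShowFileStatusSketch is ours, while getFileStatus() and the FileStatus getters are standard Hadoop API.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ShowFileStatusSketch {
    public static void main(String[] args) throws Exception {
        String uri = args[0];
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        // getFileStatus() returns metadata for a single file or directory
        FileStatus stat = fs.getFileStatus(new Path(uri));
        System.out.println("path:        " + stat.getPath());
        System.out.println("length:      " + stat.getLen());
        System.out.println("replication: " + stat.getReplication());
        System.out.println("block size:  " + stat.getBlockSize());
    }
}

For example, running it on the file just copied (hdfs://localhost/user/tom/1400-8.txt) prints the file's length, replication factor, and block size.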