
I10-001 Dumps with Real Exam Questions and Practice Test


I10-001 XML Master Basic V2

Study Guide Prepared by XML-Master Dumps Experts

Exam Questions Updated On : I10-001 Dumps and Real Questions

100% Real Questions - Exam Pass Guarantee with High Marks - Just Memorize the Answers

I10-001 exam Dumps Source : XML Master Basic V2

Test Code : I10-001
Test Name : XML Master Basic V2
Vendor Name : XML-Master
Questions : 129 Real Questions

I got a great question bank for my I10-001 exam.
The exact answers were not hard to remember. My experience of emulating the test was genuinely pleasant, as I gave all the right replies in the I10-001 exam. Much appreciated for the assistance. I conveniently completed the exam preparation within 12 days. The presentation style of this guide was simple, without any lengthy answers or convoluted clarifications. Even the topics that are hard and tricky were taught beautifully.

Found a real source for actual I10-001 exam questions.
There were just 12 days left to attempt the I10-001 exam and I was loaded with a few points. I was urgently looking for a simple and effective guide. Eventually, I got the question bank from killexams. Its brief answers were not hard to complete in 15 days. In the real I10-001 exam, I scored 88%, answering all the questions in due time, and 90% of the questions were just like the sample papers they supplied. Much obliged to killexams.

Surprised to see I10-001 up-to-date dumps!
I have been using these materials for a while for all my tests. Last week, I passed with an excellent score in the I10-001 exam by using the test resources. I had some doubts on a few topics, but the material cleared all of them. I easily found the answers for all my doubts and issues. Thanks for providing me with solid and dependable material. It is the best product as far as I know.

Unbelievable! But a true source of I10-001 real test questions.
I had taken the I10-001 preparation from here, as it was a nice platform for training, and it ultimately gave me the quality of preparation needed to get top scores in the I10-001 test. I really enjoyed the way the material presented things in an engaging way, and with its help I finally got on track. It made my preparation a great deal easier, and with its help I have been able to grow well in life.

Is there a new syllabus for the I10-001 exam?
I am now I10-001 certified, and it would not have been possible without the I10-001 exam simulator. The exam simulator has been tailored keeping in mind the requirements students confront at the time of taking the I10-001 exam. This exam simulator is very exam-focused, and every topic has been addressed in detail just to keep the students apprised of each and every piece of information. The team knows that this is the way to keep students confident and ever ready for the exam.

Just try real I10-001 test questions and success is yours.
This is a truly valid and reliable resource, with actual I10-001 questions and accurate answers. The exam simulator works very smoothly. With extra info and good customer support, this is an exceptionally good offer. No free random braindumps available online can compare with the quality and the good experience I had with Killexams. I passed with a really high score, so I'm saying this based on my personal experience.

Just try these latest dumps and success is yours.
I cleared all the I10-001 tests effortlessly. This website proved very beneficial in clearing the tests as well as in understanding the concepts. All questions are explained very well.

The I10-001 question bank is a real study aid, with genuine results.
Hi team, I have finished I10-001 on my first attempt. Thanks a lot for your helpful question bank.

A very complete and authentic I10-001 exam preparation.
Hi all, please be informed that I have passed the I10-001 exam, with this as my main preparation source, with a solid average score. This is very valid exam material, which I highly recommend to anyone working towards their IT certification. It is a reliable way to prepare and pass your IT exams. In my IT company, there is not a person who has not used/seen/heard of the materials. Not only do they help you pass, but they ensure that you learn and end up a successful professional.

Just study these up-to-date dumps and success is yours.
This was a very refreshing entry in my life, mainly because the material that I used through killexams.com's help was the one that got me to clear my I10-001 exam. Passing the I10-001 exam isn't easy, but it was for me, because I had access to great study material, and I am immensely grateful for that.

XML-Master XML Master Basic V2

Mastering Composer – Tips and Tricks | Real Questions and Pass4sure dumps

Composer has revolutionized package management in PHP. It upped the reusability game and helped PHP developers all over the world generate framework-agnostic, fully shareable code. But few people ever go beyond the basics, so this post will cover some useful tips and tricks.


Global Installation

Though it's clearly explained in the documentation, Composer can (and in most cases should) be installed globally. Global installation means that instead of typing out

php composer.phar somecommand

you could just type out

composer somecommand

in any project whatsoever. This makes starting new projects with, for example, the create-project command dead easy in any location on your filesystem.

To install Composer globally, follow these instructions.


To create a new composer.json file in a project (and thus initialize a new Composer-powered project), you can use:

composer init

You can also pass in some options as defaults.

Installing Packages the Right Way

When reading tutorials or README files of projects, many will say something like:

Just add the following to your composer.json file: "require": { "myproject": "someversion" }

But this has a few downsides. One, the copy-pasting may introduce errors. Two, for a newbie, figuring out where to place the code when you already have an extensive composer.json file in your project can be tedious and can also introduce errors. Finally, many people will be encountering Composer for the first time and in a command line, so covering all the situations in which they may find themselves isn't feasible (do they have a GUI text editor or are they on the command line? If it's the latter, do they have a text editor installed, and if so, which one? Do you explain the editing process or just leave it out? What if the file doesn't exist in their project? Do you cover the creation of the file, too?).

The best way to add a new requirement to a composer.json file is with the require command:

composer require somepackage/somepackage:someversion

This adds everything that's needed into the file, bypassing all manual intervention.

If you want to add packages to require-dev, add the --dev option, like so:

composer require phpunit/phpunit --dev

The require command supports adding several packages at once; just separate them with a space. Note that there is no need to specify the version in this approach, as seen in the code snippet above – installing a package this way automatically grabs the most recent version of the package, and tells you which one it picked.
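As a rough sketch (the package names and version constraints here are illustrative, not taken from any specific project), the two require commands above would leave entries like these in your composer.json:

```json
{
    "require": {
        "somepackage/somepackage": "^1.2"
    },
    "require-dev": {
        "phpunit/phpunit": "^4.6"
    }
}
```

The point is that Composer writes and formats these blocks for you, so you never have to hand-edit the file just to add a dependency.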

Lock Files

The composer.lock file saves the list of currently installed packages, so that when another person clones your project at a date when the dependencies may have been updated, they still get the old versions installed. This helps ensure that everyone who grabs your project has the exact same package environment you had when the project was developed, avoiding any bugs that may have been caused by version updates.

composer.lock should almost always be committed to version control.

composer.lock also contains the hash of the composer.json file, so if you update just the project author, some contact info, or a description, you'll get a warning about the lock file not matching the json file – when that's the case, running composer update --lock will help things along, updating only the lock file and not touching anything else.

Version Flags

When defining package versions, you can use exact matches (1.2.3), ranges with operators (<1.2.3), combinations of operators (>1.2.3 <1.3), best available (1.2.*), tilde (~1.2.3) and caret (^1.2.3).

The latter two may warrant further explanation:

  • tilde (~1.2.3) will go up to version 1.3 (not included), because in semantic versioning that's when new features get introduced. Tilde fetches the highest known stable minor version. As the docs say, you can think of it as only the last digit specified being allowed to change.

  • caret (^1.2.3) means "only beware of breaking changes", and will thus go up to version 2.0 (not included). According to semver, that's when breaking changes are introduced, so 1.3, 1.4 and 1.9 are fine, while 2.0 is not.

  • Unless you know you need a specific version, I recommend always using the ~1.2.3 format – it's your safest bet.
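The tilde and caret rules above can be sketched in a few lines of Python (a hypothetical illustration of the matching semantics, not part of Composer itself): ~1.2.3 allows >=1.2.3 <1.3.0, while ^1.2.3 allows >=1.2.3 <2.0.0.

```python
def parse(version):
    # Turn "1.2.3" into a comparable tuple (1, 2, 3).
    return tuple(int(part) for part in version.split("."))

def matches_tilde(candidate, base):
    # ~1.2.3 means >=1.2.3 and <1.3.0: only the last digit may change.
    lo = parse(base)
    hi = (lo[0], lo[1] + 1, 0)
    return lo <= parse(candidate) < hi

def matches_caret(candidate, base):
    # ^1.2.3 means >=1.2.3 and <2.0.0: anything short of a breaking change.
    lo = parse(base)
    hi = (lo[0] + 1, 0, 0)
    return lo <= parse(candidate) < hi

print(matches_tilde("1.2.9", "1.2.3"))   # True
print(matches_tilde("1.3.0", "1.2.3"))   # False
print(matches_caret("1.9.0", "1.2.3"))   # True
print(matches_caret("2.0.0", "1.2.3"))   # False
```

This makes the difference concrete: both constraints accept patch releases, but only the caret accepts new minor versions.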

Configuration and Global Configuration

The default values are not set in stone. See the full config reference for details.

For example, by specifying:

    "config": "optimize-autoloader": true

you force Composer to optimize the classmap after every install/update, or in other words, every time the autoload file is generated. This is a little slower than generating the default autoloader, and slows down as the project grows.

Another useful option might be cache-files-maxsize – in huge projects such as eZ Publish or Symfony, the cache can get full fairly quickly. Increasing the size will keep Composer fast for longer.
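Put together, the two options discussed above would look something like this in a project's composer.json (assuming the cache option meant here is Composer's cache-files-maxsize key; the size value is just an example):

```json
{
    "config": {
        "optimize-autoloader": true,
        "cache-files-maxsize": "2048MiB"
    }
}
```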

Note that configuration can be set globally, too, so it's consistent across projects. See here for how. For example, to add the cache size setting to our global configuration, we either edit ~/.composer/config.json or execute:

composer config --global cache-files-maxsize "2048MiB"

Profile and Verbose

You can add a --profile flag to any command you execute on the command line with Composer, and it will produce not only a final output like this:

[174.6MB/54.70s] Memory usage: 174.58MB (peak: 513.47MB), time: 54.7s

but also prefix each line it outputs with the exact total duration of the command's execution so far, plus the memory usage:

[175.9MB/54.64s] Installing assets for Sensio\Bundle\DistributionBundle into web/bundles/sensiodistribution

I use this command often to identify bottleneck packages and to see how the stats improve or degrade on different versions of PHP.

Likewise, the --verbose flag will make sure Composer outputs more information with each operation it performs, helping you understand exactly what's happening. Some people have even aliased their composer command to include --verbose --profile by default.

Custom Sources

Sometimes you just want to install from a GitHub repo if your project isn't yet on Packagist. Maybe it's under development, maybe it's locally hosted, who knows. To do that, see our guide.

Likewise, if you have your own fork of a popular project that another part of your project depends on, you can use custom sources in combination with inline aliasing to fake the version constraint, like Matthieu Napoli did here.

Speeding up Composer

As per this excellent trick by Mark Van Eijk, you can speed up Composer's execution by making it run on HHVM.

Another way is forcing it to use --prefer-dist, which downloads a stable, packaged version of a project rather than cloning it from the version control system it's on (much slower). This is on by default, though, so you shouldn't need to specify it on stable projects. If you want to download the sources, use the --prefer-source flag. More information about this is in the options of the install command here.

Making Your Composer Project Lighter

If you're someone who develops Composer-friendly projects, you may want to do your part, too. Based on this Reddit thread, you can use a .gitattributes file to ignore some of the files and folders during packaging for the --prefer-dist mode above.

/docs export-ignore
/tests export-ignore
/.gitattributes export-ignore
/.gitignore export-ignore
/.travis.yml export-ignore
/phpunit.xml export-ignore

How does this work? When you upload a project to GitHub, it automatically makes available the "Download ZIP" button, which you can use to download an archive of your project. What's more, Packagist uses these auto-generated archives to pull in the --prefer-dist dependencies, and then unarchives them once downloaded (much faster than cloning). If you thus ignore your tests, docs and other logically irrelevant files by listing them in .gitattributes, the archives won't contain them, becoming much, much lighter.

Naturally, people who want to debug your library or run its tests should then specify the --prefer-source flag.

The PHP League has adopted this approach and included it in their package skeleton, so any project based on it is automatically "dist friendly".


If you ever forget what version of PHP or its extensions you're running, or need a list of all the packages (and their descriptions) that you've installed inside the current project, along with their versions, you can use the show command with the --platform (short -p) and --installed (short -i) flags respectively:

$ composer show --installed
behat/behat v3.0.15 Scenario-oriented BDD framework for PHP 5.3
behat/gherkin v4.3.0 Gherkin DSL parser for PHP 5.3
behat/mink v1.5.0 Web acceptance testing framework for PHP 5.3
behat/mink-browserkit-driver v1.1.0 Symfony2 BrowserKit driver for Mink framework
behat/mink-extension v2.0.1 Mink extension for Behat
behat/mink-goutte-driver v1.0.9 Goutte driver for Mink framework
behat/mink-sahi-driver v1.1.0 Sahi.JS driver for Mink framework
behat/mink-selenium2-driver v1.1.1 Selenium2 (WebDriver) driver for Mink framework
behat/sahi-client dev-master ce7bfa7 Sahi.js client for PHP 5.3
behat/symfony2-extension v2.0.0 Symfony2 framework extension for Behat
behat/transliterator v1.0.1 String transliterator
components/bootstrap 3.3.2 The most popular front-end framework for developing responsive, mobile first projects on the web.
components/jquery 2.1.3 jQuery JavaScript Library
doctrine/annotations v1.2.4 Docblock Annotations Parser
doctrine/cache v1.4.1 Caching library offering an object-oriented API for many cache backends
doctrine/collections v1.3.0 Collections Abstraction library
doctrine/common v2.5.0 Common Library for Doctrine projects
doctrine/dbal v2.5.1 Database Abstraction Layer
doctrine/doctrine-bundle v1.4.0 Symfony DoctrineBundle
doctrine/doctrine-cache-bundle v1.0.1 Symfony2 Bundle for Doctrine Cache
doctrine/inflector v1.0.1 Common String Manipulations with regard to casing and singular/plural rules.
doctrine/instantiator 1.0.4 A small, lightweight utility to instantiate objects in PHP without invoking their constructors
doctrine/lexer v1.0.1 Base library for a lexer that can be used in Top-Down, Recursive Descent Parsers.
egulias/listeners-debug-command-bundle 1.9.1 Symfony 2 console command to debug listeners
ezsystems/behatbundle dev-master bd95e1b Behat bundle for help testing eZ Bundles and projects
ezsystems/comments-bundle dev-master 8f95bc7 Commenting system for eZ Publish
ezsystems/demobundle dev-master c13fb0b Demo bundle for eZ Publish Platform
ezsystems/demobundle-data v0.1.0 Data for ezsystems/demobundle
ezsystems/ezpublish-kernel dev-master 3d6e48d eZ Publish API and kernel. This is the heart of eZ Publish 5.
ezsystems/platform-ui-assets-bundle v0.5.0 External assets dependencies for PlatformUIBundle
ezsystems/platform-ui-bundle dev-master 4d0442d eZ Platform UI bundle
ezsystems/privacy-cookie-bundle v0.1 Privacy cookie banner integration bundle into eZ Publish/eZ Platform
fabpot/goutte v1.0.7 A simple PHP Web Scraper
friendsofsymfony/http-cache 1.3.1 Tools to manage cache invalidation
friendsofsymfony/http-cache-bundle 1.2.1 Set path based HTTP cache headers and send invalidation requests to your HTTP cache
guzzle/guzzle v3.9.3 PHP HTTP client. This library is deprecated in favor of
hautelook/templated-uri-bundle 2.0.0 Symfony2 Bundle that provides a RFC-6570 compatible router and URL Generator.
hautelook/templated-uri-router 2.0.1 Symfony2 RFC-6570 compatible router and URL Generator
imagine/imagine 0.6.2 Image processing for PHP 5.3
incenteev/composer-parameter-handler v2.1.0 Composer script handling your ignored parameter file
instaclick/php-webdriver 1.0.17 PHP WebDriver for Selenium 2
jdorn/sql-formatter v1.2.17 a PHP SQL highlighting library
knplabs/knp-menu v1.1.2 An object oriented menu library
knplabs/knp-menu-bundle v1.1.2 This bundle provides an integration of the KnpMenu library
kriswallsmith/assetic v1.2.1 Asset Management for PHP
kriswallsmith/buzz v0.13 Lightweight HTTP client
league/flysystem 0.5.12 Many filesystems, one API.
liip/imagine-bundle 1.2.6 This Bundle assists in image manipulation using the imagine library
monolog/monolog 1.13.1 Sends your logs to files, sockets, inboxes, databases and various web services
nelmio/cors-bundle 1.3.3 Adds CORS (Cross-Origin Resource Sharing) headers support to your Symfony2 application
ocramius/proxy-manager 0.5.2 A library providing utilities to generate, instantiate and generally operate with Object Proxies
oneup/flysystem-bundle v0.4.2 Integrates Flysystem filesystem abstraction library into your Symfony2 project.
pagerfanta/pagerfanta v1.0.3 Pagination for PHP 5.3
phpdocumentor/reflection-docblock 2.0.4
phpspec/prophecy v1.4.1 Highly opinionated mocking framework for PHP 5.3+
phpunit/php-code-coverage 2.0.16 Library that provides collection, processing, and rendering functionality for PHP code coverage information.
phpunit/php-file-iterator 1.4.0 FilterIterator implementation that filters files based on a list of suffixes.
phpunit/php-text-template 1.2.0 Simple template engine.
phpunit/php-timer 1.0.5 Utility class for timing
phpunit/php-token-stream 1.4.1 Wrapper around PHP's tokenizer extension.
phpunit/phpunit 4.6.4 The PHP Unit Testing framework.
phpunit/phpunit-mock-objects 2.3.1 Mock Object library for PHPUnit
psr/log 1.0.0 Common interface for logging libraries
qafoo/rmf 1.0.0 Very simple VC framework which makes it easy to build HTTP applications / REST webservices
sebastian/comparator 1.1.1 Provides the functionality to compare PHP values for equality
sebastian/diff 1.3.0 Diff implementation
sebastian/environment 1.2.2 Provides functionality to handle HHVM/PHP environments
sebastian/exporter 1.2.0 Provides the functionality to export PHP variables for visualization
sebastian/global-state 1.0.0 Snapshotting of global state
sebastian/recursion-context 1.0.0 Provides functionality to recursively process PHP variables
sebastian/version 1.0.5 Library that helps with managing the version number of Git-hosted PHP projects
sensio/distribution-bundle v3.0.21 Base bundle for Symfony Distributions
sensio/framework-extra-bundle v3.0.7 This bundle provides a way to configure your controllers with annotations
sensio/generator-bundle v2.5.3 This bundle generates code for you
sensiolabs/security-checker v2.0.2 A security checker for your composer.lock
swiftmailer/swiftmailer v5.4.0 Swiftmailer, free feature-rich PHP mailer
symfony-cmf/routing 1.3.0 Extends the Symfony2 routing component for dynamic routes and chaining several routers
symfony/assetic-bundle v2.6.1 Integrates Assetic into Symfony2
symfony/monolog-bundle v2.7.1 Symfony MonologBundle
symfony/swiftmailer-bundle v2.3.8 Symfony SwiftmailerBundle
symfony/symfony v2.6.6 The Symfony PHP framework
tedivm/stash v0.12.3 The place to keep your cache.
tedivm/stash-bundle v0.4.2 Incorporates the Stash caching library into Symfony.
twig/extensions v1.2.0 Common additional features for Twig that do not directly belong in core
twig/twig v1.18.1 Twig, the flexible, fast, and secure template language for PHP
white-october/pagerfanta-bundle v1.0.2 Bundle to use Pagerfanta with Symfony2
whiteoctober/breadcrumbs-bundle 1.0.2 A small breadcrumbs bundle for Symfony2
zendframework/zend-code 2.2.10 Provides facilities to generate arbitrary code using an object oriented interface
zendframework/zend-eventmanager 2.2.10
zendframework/zend-stdlib 2.2.10
zetacomponents/base 1.9 The Base package provides the basic infrastructure that all packages rely on. Therefore every component depends on this package.
zetacomponents/feed 1.4 This component handles parsing and creating RSS1, RSS2 and ATOM feeds, with support for different feed modules (dc, content, creativeCommons, geo, iTunes).
zetacomponents/mail 1.8.1 The component allows you to build and/or parse Mail messages conforming to the mail standard. It has support for attachments, multipart messages and HTML mail. It also interfaces with SMTP to send mail or IMAP, P...
zetacomponents/system-information 1.1 Provides access to common system variables, such as CPU type and speed, and the available amount of memory.

Dry Runs

To simply see if an installation of new requirements would go well, you can use the --dry-run flag with Composer's install and update commands. This will throw all the potential problems at you, without actually causing them – no changes will really be made. Excellent for testing big requirement and setup changes before actually committing to them.

composer update --dry-run --profile --verbose

Create Project

Last but not least, we must mention the create-project command, applicable to anything and everything.

Create project takes a package name as the argument, then clones the package and executes composer install inside it. This is excellent for bootstrapping projects – no more finding out the exact GitHub URL of the package you want, then cloning, then manually going into the folder and executing install.

Popular projects such as Symfony and Laravel use this approach to bootstrap a skeleton application, and many others are jumping on board.

With Laravel, for example, it's used like this:

composer create-project laravel/laravel --prefer-dist --profile --verbose

The create-project command also accepts two parameters. The first is the path into which to install. If omitted, the project's name is used. The second is the version. If omitted, the latest version is used.


Hope this list of tips and tricks has been helpful! If we missed some, do tell us and we'll update the post! And remember – if you forget some of the commands or switches, just check out the cheatsheet. Happy Composing!

Bruno is a blockchain developer and code auditor from Croatia with master's degrees in Computer Science and English Language and Literature. He was a web developer for 10 years until JavaScript drove him away. He now runs a cryptocurrency business through which he makes blockchain tech approachable to the masses, and runs Coinvendor, an onboarding platform for people to easily buy cryptocurrency. He's also a developer evangelist for, a San Francisco-based AI-powered machine vision web scraper.

What's in the Box? Interrogate Your Linux Machine's Hardware | Real Questions and Pass4sure dumps

I recently had a problem trying to install the NVIDIA driver for my computer. It seemed the latest driver had stopped supporting my graphics card, and after updating my kernel, I was out of a driver. The question, obviously, was "which card did I have?" However, I did not remember. If you need to name the chipset of your motherboard, specify the CPU in your box or get some other kind of hardware-related information, Linux offers several utilities to help you. In my case, I quickly could get the full identification of my graphics card, verify that it really was getting a little long in the tooth and decide that a newer one wasn't such a bad idea.

In this article, I discuss several ways of getting hardware data on your computer. In the most common Linux shell way, I show how to work with a few command-line utilities, but if you prefer using a GUI, I also include some graphical programs. And, if you want to get into the nitty-gritty details, I give some tips on how to get that information by using the /proc or /sys filesystem.

Glossary

Working with hardware means dealing with several acronyms, and I must admit, I had been using at least a few of them without remembering exactly what they meant. Here's a list of definitions you'll certainly need:

  • ACPI (Advanced Configuration and Power Interface): regarding power features.

  • AGP (Accelerated Graphics Port): a channel to allow attaching a video graphics card (not commonly seen since around 2008).

  • APM (Advanced Power Management): older than ACPI, also regarding power matters.

  • ATA (AT Attachment): "AT", as in the old IBM AT, a standard to connect storage devices, superseded by SATA in 2003.

  • BIOS (Basic Input/Output System): firmware used when booting an Intel-compatible PC.

  • DMA (Direct Memory Access): a feature that allows giving hardware access to RAM, independently of the CPU.

  • DMI (Desktop Management Interface): a framework for keeping track of devices in a computer.

  • IDE (Integrated Drive Electronics): an interface standard that later evolved into ATA.

  • IRQ (Interrupt ReQuest): a hardware signal that allows an interrupt handler to process a given event.

  • PCI (Peripheral Component Interconnect): a bus standard for attaching diverse hardware devices to a computer, created in 1992.

  • UEFI (Unified EFI—Extensible Firmware Interface): a 2005 replacement for BIOS, which deprecated the earlier 1998 EFI standard.

  • USB (Universal Serial Bus): a standard bus defined in 1995 to allow connecting all kinds of peripherals to a computer.

  • PATA (Parallel ATA): the new name for ATA, after SATA came out.

  • PCIe (PCI Express): a high-speed serial bus that replaced PCI and AGP in 2004.

  • RAID (Redundant Array of Independent—originally, "Inexpensive"—Disks): a data storage virtualization technology that combines several drives to work as a single one for performance improvement and/or data redundancy. There are several RAID schemes, including RAID 0 ("striping"), RAID 1 ("mirroring"), RAID 5 ("striping + parity") and RAID 10 ("striping + mirroring").

  • SATA (Serial ATA): a bus interface to connect storage devices, currently used in practically all PCs.

  • SCSI (Small Computer System Interface—pronounced "scuzzy"): a set of standards for connecting devices and transferring data between computers and peripherals.

The ls Command Family

Let's start the command-line work with a group of several utilities whose names all start with ls (Table 1). Some of these commands provide overlapping information (lsdev or lshw, for example), but by using all of them, you can get a pretty clear idea of whatever may be inside your Linux box.

Table 1. The ls* family of commands lets you access all aspects of your hardware.

lsblk: Produces information about all block devices, such as hard disks, DVD readers and more.
lscpu: Shows information like number of CPUs, cores, threads and more.
lsdev: Shows data about all devices of which the system is aware.
lshw: Lists general hardware data; provides information on every aspect of your hardware.
lspci: Shows information about PCI buses in your box and devices connected to them, such as graphics cards, network adapters and more.
lsscsi: Provides information on all SCSI devices or hosts connected to your box, such as hard disk drives or optical drives.
lsusb: Generates information about USB buses in your box and devices connected to them.

Let's start with CPU information. The lscpu command provides data on the CPUs in your box. You can choose to include all CPUs, whether offline or online, with the --all parameter, or you can opt for --online and --offline. The --parse option lets you select which CPU characteristics to list, including number, socket, cache data, maximum and minimum speed (in MHz) and more. In my case, you'll see that my computer has a somewhat old single-socket, 4-core, Intel Core 2 Quad CPU, at 2.66GHz:

> lscpu
Architecture:          x86_64
CPU op-mode(s):        32-bit, 64-bit
Byte Order:            Little Endian
CPU(s):                4
On-line CPU(s) list:   0-3
Thread(s) per core:    1
Core(s) per socket:    4
Socket(s):             1
NUMA node(s):          1
Vendor ID:             GenuineIntel
CPU family:            6
Model:                 23
Model name:            Intel(R) Core(TM)2 Quad CPU Q8400 @ 2.66GHz
Stepping:              10
CPU MHz:               2003.000
CPU max MHz:           2670.0000
CPU min MHz:           2003.0000
BogoMIPS:              5340.67
Virtualization:        VT-x
L1d cache:             32K
L1i cache:             32K
L2 cache:              2048K
NUMA node0 CPU(s):     0-3

(Note: you can get most of this information by examining the /proc/cpuinfo file or by browsing the /sys/bus/cpu/ directories; see the DIY with /proc and /sys sidebar for more on this.)
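As a minimal sketch of that DIY approach (assuming a Linux system where /proc/cpuinfo is readable; field names can vary by architecture), you can pull the core facts out of /proc/cpuinfo yourself:

```python
# Minimal DIY sketch: read CPU data straight from /proc/cpuinfo,
# no lscpu required.
with open("/proc/cpuinfo") as f:
    lines = f.read().splitlines()

# Each logical CPU gets its own "processor : N" stanza.
cpu_count = sum(1 for line in lines if line.startswith("processor"))

# "model name" holds the human-readable CPU description (on x86).
model_names = [line.split(":", 1)[1].strip()
               for line in lines if line.startswith("model name")]

print("CPUs:", cpu_count)
if model_names:
    print("Model:", model_names[0])
```

The same idea extends to /sys/bus/cpu/: each devices/cpuN directory there corresponds to one of the stanzas counted above.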

Let's move on to block devices, such as hard disks, or CD and DVD devices. The lsblk command produces information on all available block devices (see Listing 1 for an example). As you can see, I have three hard disks and a ROM (DVD) device. The three disks are called /dev/sda, /dev/sdb and /dev/sdc; the ROM device is /dev/sr0. The disks are 466GB, 149GB and 2.7TB in size. You can get a little information about partitioning too; for example, you can see that the first two disks have a swap area enabled, but the third one doesn't. You also can get the mountpoints (/, /disk-desktop and /disk-data) for the three disks.

    Listing 1. The lsblk command shows all block (storage) devices. The --topology option adds extra details; try --output-all for even more.

    > lsblk --paths
    NAME           MAJ:MIN RM   SIZE RO TYPE MOUNTPOINT
    /dev/sda         8:0    0 465.8G  0 disk
    |__/dev/sda1     8:1    0     4G  0 part [SWAP]
    |__/dev/sda2     8:2    0 461.8G  0 part /
    /dev/sdb         8:16   0 149.1G  0 disk
    |__/dev/sdb1     8:17   0     4G  0 part [SWAP]
    |__/dev/sdb2     8:18   0   145G  0 part /disk-desktop
    /dev/sdc         8:32   0   2.7T  0 disk
    |__/dev/sdc1     8:33   0   2.7T  0 part /disk-data
    /dev/sr0        11:0    1  1024M  0 rom

    > lsblk --paths --topology
    NAME    ALIGNMENT MIN-IO OPT-IO PHY-SEC LOG-SEC ROTA SCHED RQ-SIZE  RA WSAME
    sda             0    512      0     512     512    1 cfq       128 128    0B
    |__sda1         0    512      0     512     512    1 cfq       128 128    0B
    |__sda2         0    512      0     512     512    1 cfq       128 128    0B
    sdb             0    512      0     512     512    1 cfq       128 128    0B
    |__sdb1         0    512      0     512     512    1 cfq       128 128    0B
    |__sdb2         0    512      0     512     512    1 cfq       128 128    0B
    sdc             0   4096      0    4096     512    1 cfq       128 128    0B
    |__sdc1         0   4096      0    4096     512    1 cfq       128 128    0B
    sr0             0    512      0     512     512    1 cfq       128 128    0B

    There are many possible optional arguments, but the most commonly used are --paths, which produces full device paths, and --topology, in case you're interested in internal details, such as physical sector size, I/O scheduler name and so on. You can get owner, group and permissions information with --perm, as shown below (and, if you really want detailed information, try --output-all, which will list about 50 columns' worth of data):

    > lsblk --perm
    NAME      SIZE OWNER GROUP MODE
    sda     465.8G root  disk  brw-rw----
    |__sda1     4G root  disk  brw-rw----
    |__sda2 461.8G root  disk  brw-rw----
    sdb     149.1G root  disk  brw-rw----
    |__sdb1     4G root  disk  brw-rw----
    |__sdb2   145G root  disk  brw-rw----
    sdc       2.7T root  disk  brw-rw----
    |__sdc1   2.7T root  disk  brw-rw----
    sr0      1024M root  cdrom brw-rw----
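    Under the hood, lsblk reads /sys; a minimal do-it-yourself version of its size column, assuming a mounted Linux /sys (the kernel reports sizes in 512-byte sectors), could look like this:

```shell
# Print each block device and its size, read directly from /sys/block.
for dev in /sys/block/*; do
  [ -r "$dev/size" ] || continue          # skip if the glob didn't match
  sectors=$(cat "$dev/size")              # size in 512-byte sectors
  printf '%-8s %s MiB\n' "$(basename "$dev")" $(( sectors * 512 / 1048576 ))
done
```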

    For SCSI devices, you can add --scsi to lsblk, but there's also the more specific lsscsi command. The basic information it produces is shown below, and it includes all available SCSI devices. In my case, it shows the three hard disks and the optical reader I already found with lsblk, plus three card readers. Note that you also get more information on specific makes and models. For example, I have two Western Digital hard drives (WD5000AAKS and WD30EZRX), plus a Maxtor desktop drive (STM316021) and a Sony AD-7200S DVD unit:

    > lsscsi
    [2:0:0:0]    disk    ATA      WDC WD5000AAKS-0 1D05  /dev/sda
    [2:0:1:0]    disk    ATA      MAXTOR STM316021 D     /dev/sdb
    [3:0:0:0]    disk    ATA      WDC WD30EZRX-00M 0A80  /dev/sdc
    [3:0:1:0]    cd/dvd  SONY     DVD RW AD-7200S  1.61  /dev/sr0
    [4:0:0:0]    disk    Sony     Card_R/W -CF     1.11  /dev/sdd
    [4:0:0:1]    disk    Sony     Card_R/W -SD     1.11  /dev/sde
    [4:0:0:2]    disk    Sony     Card_R/W -MS     1.11  /dev/sdf

    Explore all the possibilities of this command with lsscsi --help. You'll see that you really can dig down into SCSI devices with it. And, if you're interested, this command works by scanning the /sys filesystem (see Resources, and the DIY with /proc and /sys sidebar for more information).
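    A rough sketch of that /sys scan, reading the vendor and model attributes the kernel exports for each SCSI device, might look like this (on a machine with no SCSI devices, such as a container, it simply prints nothing):

```shell
# For every SCSI device known to the kernel, print its H:C:T:L address
# plus the vendor and model strings exported in /sys.
for dev in /sys/class/scsi_device/*/device; do
  [ -e "$dev" ] || continue               # no SCSI devices: print nothing
  addr=$(basename "$(dirname "$dev")")    # e.g. 2:0:0:0
  printf '[%s] %s %s\n' "$addr" "$(cat "$dev/vendor")" "$(cat "$dev/model")"
done
```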

    Now, let's move on to some other commands. lsusb gives information on all USB-connected devices; see Listing 2 for an example. (An alternative is usb-devices, but it's somewhat more obscure in its output and has no configuration options.) As in most modern computers, you'll probably have plenty of such devices. In my case, I have a Bluetooth dongle, webcam, keyboard, mouse and more. You can get information on a particular bus or device with the -s option or select a given vendor with the -d option; for the latter, check the USB ID repository (see Resources) for vendor/device numbers. Finally, if you want very detailed information, try the -v (verbose) option, but be prepared to read a lot. For my machine, lsusb -v produces more than 1,300 lines of output.

    Listing 2. The lsusb command reports all USB-connected devices, as a list or in tree form.

    > lsusb
    Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
    Bus 005 Device 002: ID 054c:01bd Sony Corp. MRW62E Multi-Card Reader/Writer
    Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
    Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
    Bus 003 Device 002: ID 0a12:0001 Cambridge Silicon Radio, Ltd Bluetooth Dongle (HCI mode)
    Bus 003 Device 006: ID 1e4e:0100 Cubeternet WebCam
    Bus 003 Device 005: ID 046d:c317 Logitech, Inc. Wave Corded Keyboard
    Bus 003 Device 004: ID 04f3:0232 Elan Microelectronics Corp. Mouse
    Bus 003 Device 003: ID 05e3:0608 Genesys Logic, Inc. Hub
    Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
    Bus 002 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub

    > lsusb --tree
    /:  Bus 05.Port 1: Dev 1, Class=root_hub, Driver=uhci_hcd/2p, 12M
        |__ Port 2: Dev 2, If 0, Class=Mass Storage, Driver=usb-storage, 12M
    /:  Bus 04.Port 1: Dev 1, Class=root_hub, Driver=uhci_hcd/2p, 12M
    /:  Bus 03.Port 1: Dev 1, Class=root_hub, Driver=uhci_hcd/2p, 12M
        |__ Port 1: Dev 3, If 0, Class=Hub, Driver=hub/4p, 12M
            |__ Port 1: Dev 4, If 0, Class=Human Interface Device, Driver=usbhid, 1.5M
            |__ Port 2: Dev 5, If 0, Class=Human Interface Device, Driver=usbhid, 1.5M
            |__ Port 2: Dev 5, If 1, Class=Human Interface Device, Driver=usbhid, 1.5M
            |__ Port 3: Dev 6, If 0, Class=Video, Driver=uvcvideo, 12M
            |__ Port 3: Dev 6, If 1, Class=Video, Driver=uvcvideo, 12M
        |__ Port 2: Dev 2, If 0, Class=Wireless, Driver=btusb, 12M
        |__ Port 2: Dev 2, If 1, Class=Wireless, Driver=btusb, 12M
    /:  Bus 02.Port 1: Dev 1, Class=root_hub, Driver=uhci_hcd/2p, 12M
    /:  Bus 01.Port 1: Dev 1, Class=root_hub, Driver=ehci-pci/8p, 480M
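    When -s and -d aren't flexible enough, a captured listing like the one above can be filtered with ordinary text tools; for example, keeping only one vendor's devices by matching the vendor half of the ID field (the listing below is a shortened excerpt, embedded so the example runs anywhere):

```shell
# Keep only Linux Foundation root hubs (vendor ID 1d6b) from a saved
# lsusb listing; the ID is the sixth whitespace-separated field.
listing='Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 005 Device 002: ID 054c:01bd Sony Corp. MRW62E Multi-Card Reader/Writer
Bus 003 Device 006: ID 1e4e:0100 Cubeternet WebCam'
printf '%s\n' "$listing" | awk '$6 ~ /^1d6b:/'
```

    On a live system, piping lsusb itself into the same awk filter works identically.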

    Another command that can produce a ton of information is lspci, which shows all data on PCI devices. And, as a matter of fact, this is the actual command I used to find out what kind of graphics card I had:

    # lspci
    00:00.0 Host bridge: Intel Corporation 4 Series Chipset DRAM Controller (rev 03)
    00:01.0 PCI bridge: Intel Corporation 4 Series Chipset PCI Express Root Port (rev 03)
    00:1b.0 Audio device: Intel Corporation NM10/ICH7 Family High Definition Audio Controller (rev 01)
    00:1c.0 PCI bridge: Intel Corporation NM10/ICH7 Family PCI Express Port 1 (rev 01)
    00:1c.1 PCI bridge: Intel Corporation NM10/ICH7 Family PCI Express Port 2 (rev 01)
    00:1d.0 USB controller: Intel Corporation NM10/ICH7 Family USB UHCI Controller #1 (rev 01)
    00:1d.1 USB controller: Intel Corporation NM10/ICH7 Family USB UHCI Controller #2 (rev 01)
    00:1d.2 USB controller: Intel Corporation NM10/ICH7 Family USB UHCI Controller #3 (rev 01)
    00:1d.3 USB controller: Intel Corporation NM10/ICH7 Family USB UHCI Controller #4 (rev 01)
    00:1d.7 USB controller: Intel Corporation NM10/ICH7 Family USB2 EHCI Controller (rev 01)
    00:1e.0 PCI bridge: Intel Corporation 82801 PCI Bridge (rev e1)
    00:1f.0 ISA bridge: Intel Corporation 82801GB/GR (ICH7 Family) LPC Interface Bridge (rev 01)
    00:1f.1 IDE interface: Intel Corporation 82801G (ICH7 Family) IDE Controller (rev 01)
    00:1f.2 IDE interface: Intel Corporation NM10/ICH7 Family SATA Controller [IDE mode] (rev 01)
    00:1f.3 SMBus: Intel Corporation NM10/ICH7 Family SMBus Controller (rev 01)
    01:00.0 Ethernet controller: Qualcomm Atheros AR8152 v2.0 Fast Ethernet (rev c1)
    04:00.0 VGA compatible controller: NVIDIA Corporation GK107 [GeForce GT 740] (rev a1)
    04:00.1 Audio device: NVIDIA Corporation GK107 HDMI Audio Controller (rev a1)
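    Scripting against output like this is straightforward; for instance, pulling out just the graphics adapter line (here applied to a short captured excerpt, so the example is self-contained):

```shell
# Find the VGA controller in a saved lspci listing.
pci='00:1f.2 IDE interface: Intel Corporation NM10/ICH7 Family SATA Controller [IDE mode] (rev 01)
01:00.0 Ethernet controller: Qualcomm Atheros AR8152 v2.0 Fast Ethernet (rev c1)
04:00.0 VGA compatible controller: NVIDIA Corporation GK107 [GeForce GT 740] (rev a1)'
printf '%s\n' "$pci" | grep -i 'vga'
# On a live system the equivalent would be:  lspci | grep -i vga
```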

    Try the -v or -vv options, for verbose and very verbose listings. To get full information on my (current) graphics card, I proceeded as shown in Listing 3. I now have an NVIDIA GeForce GT 740, and I'm using the nouveau kernel driver, among other internal details. Of course, to understand the produced information fully, you need a bit of experience with PCI devices. Try the same command with -vv, and you'll see what I'm talking about.

    Listing 3. The -v option gives more detailed information; -vv goes even deeper.

    # lspci -v -s 04:00.0
    04:00.0 VGA compatible controller: NVIDIA Corporation GK107 [GeForce GT 740] (rev a1) (prog-if 00 [VGA controller])
            Subsystem: Corp. Device 2742
            Flags: bus master, fast devsel, latency 0, IRQ 27
            Memory at fd000000 (32-bit, non-prefetchable) [size=16M]
            Memory at e0000000 (64-bit, prefetchable) [size=256M]
            Memory at de000000 (64-bit, prefetchable) [size=32M]
            I/O ports at ec00 [size=128]
            [virtual] Expansion ROM at fe000000 [disabled] [size=512K]
            Capabilities: [60] Power Management version 3
            Capabilities: [68] MSI: Enable+ Count=1/1 Maskable- 64bit+
            Capabilities: [78] Express Endpoint, MSI 00
            Capabilities: [b4] Vendor Specific Information: Len=14
            Capabilities: [100] Virtual Channel
            Capabilities: [128] Power Budgeting
            Capabilities: [600] Vendor Specific Information: ID=0001 Rev=1 Len=024
            Capabilities: [900] #19
            Kernel driver in use: nouveau
            Kernel modules: nouveau

    If you're even more electronically/digitally minded, lsdev produces information about your installed hardware, including interrupts, ports, addresses and all such internal details. This command offers no options, and it's unlikely you'll use it unless you're dealing very closely with hardware. Listing 4 shows an abbreviated example of the output. This command scans /proc/interrupts, /proc/ioports and /proc/dma, as described in the DIY with /proc and /sys sidebar.

    Listing 4. The lsdev command gives information on interrupts, ports and direct memory access.

    > lsdev
    Device            DMA   IRQ  I/O Ports
    ------------------------------------------------
                       7
    0000:00:1d.0                 c480-c49f
    0000:00:1d.1                 c800-c81f
    0000:00:1d.2                 c880-c89f
    ... (several lines snipped out) ...
    eth0                    29
    fpu                          00f0-00ff
    gpio_ich                     0480-04bf 04b0-04bf
    i801_smbus              19   0400-041f
    i8042              1    12
    iTCO_wdt                     0830-0833 0830-0833 0860-087f 0860-087f
    keyboard                     0060-0060 0064-0064
    ... (several lines snipped out) ...
    timer              0
    timer0                       0040-0043
    timer1                       0050-0053
    uhci_hcd                     c480-c49f c800-c81f c880-c89f cc00-cc1f
    uhci_hcd:usb2           23
    uhci_hcd:usb3           19
    uhci_hcd:usb4           18
    uhci_hcd:usb5           16
    vesafb                       03c0-03df
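    Since lsdev merely aggregates three /proc files, you can inspect the raw data yourself; a quick look at the interrupt table, for example:

```shell
# The first lines of /proc/interrupts: one column per CPU, then the
# handler name; /proc/ioports and /proc/dma follow the same idea.
head -n 5 /proc/interrupts
```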

    Finally, if you've made it this far, the lshw command is a sort of catch-all that can produce lots of information on all of your installed hardware. The -short option gives a (somewhat) abbreviated list of everything in your box (see Listing 5, and note some fun lines, "To Be Filled By O.E.M.", which show that someone was careless when setting up my motherboard). With this command, you get information on the system, buses, memory, processor, display, network and everything else.

    Listing 5. The lshw command contains information on all your hardware.

    # lshw -short
    H/W path            Device      Class          Description
    =============================================================
                                    system         To Be Filled By O.E.M.
    /0                              bus            G41M-VS3
    /0/0                            memory         64KiB BIOS
    /0/4                            processor      Core 2 Quad (To Be Filled By O.E.M.)
    /0/4/5                          memory         128KiB L1 cache
    /0/4/6                          memory         4MiB L2 cache
    /0/d                            memory         4GiB System Memory
    /0/d/0                          memory         4GiB DIMM SDRAM Synchronous
    /0/d/1                          memory         DIMM [empty]
    /0/100                          bridge         4 Series Chipset DRAM Controller
    /0/100/1                        bridge         4 Series Chipset PCI Express Root Port
    /0/100/1/0                      display        GK107 [GeForce GT 740]
    /0/100/1/0.1                    multimedia     GK107 HDMI Audio Controller
    /0/100/1b                       multimedia     NM10/ICH7 Family High Definition Audio Controller
    /0/100/1c                       bridge         NM10/ICH7 Family PCI Express Port 1
    /0/100/1c.1                     bridge         NM10/ICH7 Family PCI Express Port 2
    /0/100/1c.1/0       eth0        network        AR8152 v2.0 Fast Ethernet
    /0/100/1d                       bus            NM10/ICH7 Family USB UHCI Controller #1
    /0/100/1d/1         usb2        bus            UHCI Host Controller
    /0/100/1d.1                     bus            NM10/ICH7 Family USB UHCI Controller #2
    /0/100/1d.1/1       usb3        bus            UHCI Host Controller
    /0/100/1d.1/1/1                 bus            USB2.0 Hub
    /0/100/1d.1/1/1/1               input          OM
    /0/100/1d.1/1/1/2               input          USB Multimedia Keyboard
    /0/100/1d.1/1/1/3               multimedia     USB2.0 Camera
    /0/100/1d.1/1/2                 communication  Bluetooth Dongle (HCI mode)
    ... (several lines snipped out) ...
    /0/1                scsi2       storage
    /0/1/0.0.0          /dev/sda    disk           500GB WDC WD5000AAKS-0
    /0/1/0.0.0/1        /dev/sda1   volume         4102MiB Linux swap volume
    /0/1/0.0.0/2        /dev/sda2   volume         461GiB EXT4 volume
    /0/1/0.1.0          /dev/sdb    disk           160GB MAXTOR STM316021
    /0/1/0.1.0/1        /dev/sdb1   volume         4094MiB Linux swap volume
    /0/1/0.1.0/2        /dev/sdb2   volume         145GiB EXT3 volume
    /0/2                scsi3       storage
    /0/2/0.0.0          /dev/sdc    disk           3TB WDC WD30EZRX-00M
    /0/2/0.0.0/1        /dev/sdc1   volume         2794GiB EXT4 volume
    /0/2/0.1.0          /dev/cdrom  disk           DVD RW AD-7200S

    Note the "Class" column in Listing 5. You can get a hint of the full information that lshw can provide by using the -class parameter to restrict output. For example, see below the detailed specifications of my network card; it shows the vendor, model and many other details (warning: this is the kind of output you get if you don't restrict the command with -short; for my machine, lshw without any extra options produces a list more than 500 lines long):

    # lshw -class network
      *-network
           description: Ethernet interface
           product: AR8152 v2.0 Fast Ethernet
           vendor: Qualcomm Atheros
           physical id: 0
           bus info: pci@0000:01:00.0
           logical name: eth0
           version: c1
           serial: bc:5f:f4:12:e0:f1
           size: 100Mbit/s
           capacity: 100Mbit/s
           width: 64 bits
           clock: 33MHz
           capabilities: pm msi pciexpress vpd bus_master cap_list ethernet
                         physical tp 10bt 10bt-fd 100bt 100bt-fd autonegotiation
           configuration: autonegotiation=on broadcast=yes driver=atl1c
                          driverversion= duplex=full latency=0 link=yes multicast=yes
                          port=twisted pair speed=100Mbit/s
           resources: irq:29 memory:fcfc0000-fcffffff ioport:dc00(size=128)

    The lshw command has several other interesting options. For example, it can produce either HTML or XML output (add the -html or -xml options); the former is suitable for showing in a browser, while the latter is useful if you want to store or process your hardware information. See Figure 1 for just a small part of the full hardware description of my box. For security purposes, the -sanitize option removes sensitive information, such as serial numbers. There's even an -X option to use a graphical interface (I'll get to that later).

    Figure 1. The lshw command can also produce HTML or XML output; the former is shown here.
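    Once you have XML output, any text or XML tool can slice it. As a sketch, here's how sed could extract <product> elements from a small hand-made fragment in lshw's general style (a real dump from lshw -xml would be processed the same way, and a proper XML parser would be more robust):

```shell
# Pull product names out of an lshw-style XML fragment.
xml='<node id="display" class="display">
  <description>VGA compatible controller</description>
  <product>GK107 [GeForce GT 740]</product>
</node>'
printf '%s\n' "$xml" | sed -n 's|.*<product>\(.*\)</product>.*|\1|p'
```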

    So far, I've discussed several ls* commands, and even though they aren't really a "family", they're my favorite tools. It's easy to remember them by typing ls and letting tab-completion suggest the rest. However, there are more command-line possibilities, so let's take a look.

    What's SMBIOS?

    How does Linux know what devices are installed? Since 1995, the SMBIOS (System Management BIOS) specification has provided this kind of information, eliminating the need for potentially risky operations like hardware probing. This standard (used via DMI) is geared to Intel 32- and 64-bit processor architecture systems. Essentially, it defines a table with appropriate data for each type of device, such as CPU, RAM, system slots and more. In principle, you could parse and decode this table yourself, but several of the commands shown in this article already do that job. If you're curious about the specifics of the standard, see the Resources section.

    More Command-Line Options

    Let's start with some generic commands. The first, dmidecode, lets you dump the machine's DMI (or SMBIOS; see the What's SMBIOS? sidebar) table in a more readable format. If the table is found, its contents are dumped record by record, similar to this:

    # dmidecode -t 6
    # dmidecode 2.12
    SMBIOS 2.5 present.

    Handle 0x0009, DMI type 6, 12 bytes
    Memory Module Information
            Socket Designation: DIMM0
            Bank Connections: 0 1
            Current Speed: Unknown
            Type: DIMM SDRAM
            Installed Size: 4096 MB (Double-bank Connection)
            Enabled Size: 4096 MB (Double-bank Connection)
            Error Status: OK

    Handle 0x000A, DMI type 6, 12 bytes
    Memory Module Information
            Socket Designation: DIMM1
            Bank Connections: 4 5
            Current Speed: Unknown
            Type: DIMM SDRAM
            Installed Size: Not Installed
            Enabled Size: Not Installed
            Error Status: OK

    If you don't want to list the whole table (several hundred lines on my machine), you can restrict the output to a specific type of entry, according to the SMBIOS definitions (Table 2).

    Table 2. SMBIOS has several record types that you can select with dmidecode.

    Type     Description
    0        BIOS
    1        System
    2        Baseboard
    3        Chassis
    4        Processor
    5        Memory Controller
    6        Memory Module
    7        Cache
    8        Port Connector
    9        System Slots
    10       On-board Devices
    11       OEM Strings
    12       System Configuration Options
    13       BIOS Language
    14       Group Associations
    15       System Event Log
    16       Physical Memory Array
    17       Memory Device
    18       32-bit Memory Error
    19       Memory Array Mapped Address
    20       Memory Device Mapped Address
    21       Built-in Pointing Device
    22       Portable Battery
    23       System Reset
    24       Hardware Security
    25       System Power Controls
    26       Voltage Probe
    27       Cooling Device
    28       Temperature Probe
    29       Electrical Current Probe
    30       Out-of-band Remote Access
    31       Boot Integrity Services
    32       System Boot
    33       64-bit Memory Error
    34       Management Device
    35       Management Device Component
    36       Management Device Threshold Data
    37       Memory Channel
    38       IPMI Device
    39       Power Supply
    40       Additional Information
    41       Onboard Devices Extended Information
    42       Management Controller Host Interface
    126      Disabled Entry
    127      "End-of-Table" Special Marker
    128-255  OEM-specific Data

    You can also use special keywords to restrict the output to just a few types (Table 3).

    Table 3. You also can use special keywords to get related information from SMBIOS.

    SMBIOS Keyword   SMBIOS Types
    bios             0, 13
    system           1, 12, 15, 23, 32
    baseboard        2, 10, 41
    chassis          3
    processor        4
    memory           5, 6, 16, 17
    cache            7
    connector        8
    slot             9
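    The keywords are just shorthand for lists of types; if you were scripting around dmidecode, you could encode the mapping yourself. Here's a sketch of Table 3 as a shell function (the function name is made up for the example):

```shell
# Map a dmidecode keyword to its SMBIOS type list, per Table 3.
smbios_types() {
  case "$1" in
    bios)      echo "0,13" ;;
    system)    echo "1,12,15,23,32" ;;
    baseboard) echo "2,10,41" ;;
    chassis)   echo "3" ;;
    processor) echo "4" ;;
    memory)    echo "5,6,16,17" ;;
    cache)     echo "7" ;;
    connector) echo "8" ;;
    slot)      echo "9" ;;
    *)         echo "unknown keyword" >&2; return 1 ;;
  esac
}
smbios_types memory    # prints 5,6,16,17
```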

    If I were to give an award for "Most Talkative Command", it would surely go to hwinfo, another command that can dump all the hardware information on your machine. On my computer, running hwinfo without any parameters produces more than 12,000 lines, including several memory dumps of the SMBIOS table. You can produce a much more compact version with the --short option (Listing 6).

    Listing 6. The hwinfo command can be quite talkative; using the --short option makes it more manageable.

    # hwinfo --short
    cpu:
      Intel(R) Core(TM)2 Quad CPU Q8400 @ 2.66GHz, 2670 MHz
      Intel(R) Core(TM)2 Quad CPU Q8400 @ 2.66GHz, 2336 MHz
      Intel(R) Core(TM)2 Quad CPU Q8400 @ 2.66GHz, 2670 MHz
      Intel(R) Core(TM)2 Quad CPU Q8400 @ 2.66GHz, 2670 MHz
    keyboard:
      Logitech USB Multimedia Keyboard
    mouse:
      Elan Microelectronics OM
    monitor:
      SAMSUNG SA300/SA350
      SAMSUNG S20B300
    graphics card:
      nVidia VGA compatible controller
    sound:
      Intel NM10/ICH7 Family High Definition Audio Controller
      nVidia GK107 HDMI Audio Controller
    storage:
      Intel 82801G (ICH7 Family) IDE Controller
      Intel NM10/ICH7 Family SATA Controller [IDE mode]
    network:
      eth0       Atheros AR8152 v2.0 Fast Ethernet
    network interface:
      lo         Loopback network interface
      eth0       Ethernet network interface
    disk:
      /dev/sda   WDC WD5000AAKS-0
      /dev/sdb   MAXTOR STM316021
      /dev/sdc   WDC WD30EZRX-00M
    ... (rest of the listing snipped out)
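    The --short format groups entries under unindented section headers, which makes it easy to slice with awk; here applied to a captured two-section excerpt so the example is self-contained:

```shell
# Print only the "disk:" section of a saved hwinfo --short report:
# turn printing on at the header, off at the next unindented line.
short='disk:
  /dev/sda  WDC WD5000AAKS-0
  /dev/sdb  MAXTOR STM316021
network:
  eth0  Atheros AR8152 v2.0 Fast Ethernet'
printf '%s\n' "$short" | awk '/^disk:/ { on=1; next } /^[^ ]/ { on=0 } on'
```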

    You can restrict hwinfo to a specific type of hardware by adding an option, such as --monitor or --printer. Get the full list of options with hwinfo --help. For example, I can dump the optical unit data with hwinfo --cdrom (Listing 7). The --listmd option lets you include RAID devices, which usually aren't included in the regular output.

    Listing 7. The hwinfo command can restrict its output to specific hardware, such as the cdrom device, for example.

    # hwinfo --cdrom
    25: SCSI 301.0: 10602 CD-ROM (DVD)
      [Created at block.249]
      Unique ID: KD9E.SGHalmfn+h9
      Parent ID: w7Y8.xyd+qedQTr5
      SysFS ID: /class/block/sr0
      SysFS BusID: 3:0:1:0
      SysFS Device Link: /devices/pci0000:00/0000:00:1f.2/ata4/host3/target3:0:1/3:0:1:0
      Hardware Class: cdrom
      Model: "SONY DVD RW AD-7200S"
      Vendor: "SONY"
      Device: "DVD RW AD-7200S"
      Revision: "1.61"
      Driver: "ata_piix", "sr"
      Driver Modules: "ata_piix", "sr_mod"
      Device File: /dev/sr0 (/dev/sg3)
      Device Files: /dev/sr0, /dev/cdrom, /dev/cdrw,
                    /dev/disk/by-id/ata-Optiarc_DVD_RW_AD-7200S,
                    /dev/disk/by-path/pci-0000:00:1f.2-ata-2.1,
                    /dev/dvd, /dev/dvdrw
      Device Number: block 11:0 (char 21:3)
      Features: CD-R, CD-RW, DVD, DVD-R, DVD-RW, DVD-R DL,
                DVD+R, DVD+RW, DVD+R DL, DVD-RAM, MRW, MRW-W
      Drive status: no medium
      Config Status: cfg=no, avail=yes, need=no, active=unknown
      Attached to: #14 (IDE interface)
      Drive Speed: 48

    Of the command-line programs I'm covering in this article, inxi is the most colorful, although only moderately so (Figure 2).

    Figure 2. inxi, even though it's only a command-line tool, at least tries to use some colors.

    If invoked without parameters, it will just produce a line like the following, showing CPU, kernel, uptime and a few more details:

    CPU~Quad core Intel Core2 Quad CPU Q8400 (-MCP-) clocked at 2003.000 Mhz
    Kernel~4.1.5-1-desktop x86_64 Up~2 days 23:24 Mem~2377.4/3949.4MB
    HDD~3660.7GB(67.9% used) Procs~202 Client~Shell inxi~1.7.24
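    That Key~Value packing makes the one-liner easy to script against; for instance, plucking out the kernel field from a captured summary (the line below is shortened for the example):

```shell
# Extract the Kernel~... field from a saved inxi summary line.
line='CPU~Quad core Intel Core2 Quad CPU Q8400 (-MCP-) Kernel~4.1.5-1-desktop x86_64 Up~2 days 23:24'
printf '%s\n' "$line" | grep -o 'Kernel~[^ ]*'
```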

    However, you can use many options to get specific data. For example, you can set the verbosity level with options -v0 (minimum) through -v7 (highest verbosity). The -x option allows including extra information for some hardware. Check out inxi -h to get all possible options. For example, you can get audio information with inxi -A or graphics card data with inxi -G and the like:

    # inxi -A
    Audio:     Card-1: NVIDIA GK107 HDMI Audio Controller driver: snd_hda_intel
               Sound: ALSA ver: k4.1.5-1-desktop
               Card-2: Intel NM10/ICH7 Family High Definition Audio Controller
               driver: snd_hda_intel

    Now, let's conclude with some GUI options.

    The GUI approach

    To begin with, usbview is a rough graphical equivalent of lsusb or usb-devices, which I discussed earlier. It's quite simple to use, with no options or parameters. It shows two columns: the left one is a tree of all available USB devices, and the right one gives the full details. Figure 3 shows details on my USB keyboard.

    Figure 3. The usbview command shows the details of all USB devices in tree form.

    Let's move on to a command I already mentioned, which shares the display style: lshw -X. Instead of producing a listing (as shown previously), the -X option produces a graphical interface with several columns on the left to let you pick which hardware to inspect. An area on the right shows the full hardware details for the selected device. Figure 4 shows the result of examining my optical DVD reader/writer unit; the provided information includes other details, such as the logical unit name, its capabilities and more.

    Figure 4. The lshw -X command produces a graphical interface that lets you browse all hardware devices.

    Another interesting program is hardinfo, which "is not dead, but needs a maintainer", according to its GitHub page (see Resources). This program shows a tree structure on the left with four main branches:

  • Computer shows plenty of details about your machine; some are related to software and not to hardware.

  • Devices includes all devices in your box, grouped by type.

  • Network not only shows network card details, but also other aspects, such as DNS servers or routing.

  • Benchmarks lets you see how your machine fares against other computers, but because of the lack of updates, the comparisons are against old CPUs.

    Figure 5 shows sample output.

    Figure 5. The hardinfo command includes several extra pieces of data, not limited only to hardware.

    There are two more options. The "Information" menu entry lets you produce a report, in either HTML or plain-text format, picking whichever aspects interest you. The "Network Updater" should let you update the internal program data, including more recent benchmark results, but when I tried to run it, I got a "Contacting HardInfo Central Database (failed)" message. See Figure 6 for an example of the produced HTML report.

    Figure 6. The hardinfo command can produce an HTML or text report describing your complete system.

    Let's end with KDE's own kinfocenter. This program (see Figure 7, which shows RAM details for my machine) is similar to the previous tools I've been describing, and it offers a left pane with a tree of all available options and a right pane with more details on the option selected on the left.

    Figure 7. KDE's own kinfocenter shows not only hardware details, but plenty of other system data as well.

    The program doesn't restrict itself to hardware details, but shows all kinds of other information, such as "Samba Status", "Energy Information" or "X-Server", just to mention a few.

    DIY with /proc and /sys

    Linux is full of directories and files, but the /proc and /sys directories are really odd. They don't actually exist, yet you can browse them. They're full of zero-length, apparently empty files, yet you can open and view them. The /proc directory came first, and it originally held all details about running processes (hence the /proc name). Over time, more files were added to it, mainly "virtual" ones, which don't really exist but are created on the fly when you try to access them. (Most virtual files sport a current timestamp, which shows that they are constantly kept up to date and their contents are the latest possible.) The /sys directory is more modern. It appeared around the time of the 2.6 kernel to introduce more order and a better structure than provided by the older /proc, which had just grown in a sort of haphazard way. Many of the files (but not all) in /proc are duplicated in /sys, and whenever possible, you should opt for the files in the latter directory. The /sys directory has several subdirectories:

  • block/ has an entry pointing to each block device.

  • bus/ has directories for each bus type, and within each, two subdirectories: devices/ and drivers/. The former has an entry for each device, pointing to the device's directory in the global device tree, and the latter has a directory for each driver that was loaded for devices on the given bus.

  • class/ has directories for each class of object; some examples are block/, graphics/, net/, sound/ and so on.

  • dev/ offers directories for each type of device (for example, dev/block/ or dev/char/), each with entries for each relevant device.

  • devices/ contains the global device hierarchy, with every physical device in your system.

  • firmware/ contains directories for firmware-specific objects; for example, acpi/ or memmap/, but the specific directories on your own machine depend on the firmware drivers in your kernel.

  • fs/ has a directory for each filesystem type on your machine, each with further directories for each specific device; for example, I have /sys/fs/ext4/sda2, because the disk mounted as /dev/sda2 uses ext4.

  • kernel/ has several files related to the currently loaded kernel.

  • module/ has a subdirectory for each module loaded into the kernel.

  • power/ represents the power subsystem.

    When you get to the deepest levels of any branch, you'll find any number of individual files, which you can read to get attributes of the given object. What files? That's a hard question to answer, since it depends on which specific branch you're visiting, so you'll have to do a bit of work before you can extract information from the /sys directory. (See Resources for some pointers about this.) Also, be aware that you can write to some of the files, which amounts to modifying the corresponding parameter; be warned: do this with care! However, if you keep at it, you'll be able to duplicate the functionality of most of the tools shown in this article, which frequently work the same way.
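    As a concrete starting point for exploring on your own, here's a tiny sketch that reads a couple of attributes from /sys (it assumes a mounted /sys and simply prints nothing for branches that are absent on your machine):

```shell
# Block devices: name plus size in 512-byte sectors.
for dev in /sys/class/block/*; do
  [ -r "$dev/size" ] || continue
  printf '%-10s %s sectors\n' "$(basename "$dev")" "$(cat "$dev/size")"
done

# Network interfaces: name plus operational state (up, down, unknown...).
for nic in /sys/class/net/*; do
  [ -r "$nic/operstate" ] || continue
  printf '%-10s %s\n' "$(basename "$nic")" "$(cat "$nic/operstate")"
done
```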


    I've covered many commands that let you query your Linux machine and learn, in more or less detail, what exactly is in it. And if you need to, you can even get at the base data yourself and whip up your own hardware-inspection tool.


    Resources

    Read about the SMBIOS standard on the DMTF site; at the time of this writing, the latest version is 3.0.0, dated 2/15/2015.

    You can find general information on sysfs, as well as more detailed documentation, in the kernel documentation.

    Regarding the older procfs, check the kernel documentation as well.

    The USB ID repository has the full list of all known IDs used in USB devices.

    The PCI ID repository offers a centralized list of PCI device IDs.

    The lscpu and lsblk commands are part of the util-linux package. For documentation, check out http://linux.die.net/man/1/lscpu and http://linux.die.net/man/8/lsblk, respectively.

    Read about lsscsi options and find its manual page at http://linux.die.net/man/8/lsscsi.

    For the lsdev man page, see http://linux.die.net/man/8/lsdev.

    The lshw manual page is at http://linux.die.net/man/1/lshw.

    See lsusb on the "usbutils" page, where you can also get more information.

    You can find lspci in the "PCI Utilities" package, along with its man page.

    Check out usbview and its man page at http://linux.die.net/man/8/usbview.

    The hardinfo source repository is on GitHub, but first check your distribution's repositories; it's likely to already be there. Note that the program's last update was more than two years ago, and no further maintenance has been done.

    You can find KInfoCenter on the KDE site.

    Deploying CLR Assemblies with T-SQL

    Microsoft introduced the ability to use .NET CLR stored procedures and functions in SQL Server some time ago, beginning with SQL Server 2005. Now, more than eight years later, I think many developers are like me: I acknowledge the power of CLR routines, but I try to avoid using CLR.

    Part of the reason for this avoidance has to do with technical issues. But honestly, for me, part of the reason also has to do with the increased complexity that CLR introduces into development, deployment, and maintenance of the database.

    This article will demonstrate an approach to deploying and managing CLR routines that may be more comfortable for T-SQL developers and DBAs, and one that doesn't involve the use of Visual Studio. This approach also encapsulates everything necessary to deploy the CLR assembly within the database, meaning that a database backup will preserve all required dependencies.

    The basic goal of this exercise is to create a stored procedure that, when executed, will compile C# code, sign the .DLL, register the assembly in SQL, and create the wrapper SQL objects, all within that one stored procedure. In this way, deployment of the CLR assembly is as easy as running a stored procedure. Everything is self-contained and in one place: no independent .DLLs, Visual Studio projects, or C# source to keep track of.

    In addition, this exercise attempts to follow best practices for deployment, such as signing the assembly and properly securing it in SQL. These are things that often get overlooked when in a hurry to deploy a CLR assembly.


    If you just want to skip to the final result: I have created a stored procedure to deploy a sample assembly as follows:

    CREATE PROCEDURE dbo.spExample_RegisterAssembly_PDFCLR
    AS
    BEGIN
        DECLARE @FilePath varchar(1024)
        SET @FilePath = 'c:\ServerEnvironment\'

        CREATE TABLE #References (AssemblyName sysname, FQFileName varchar(1024))
        INSERT INTO #References (AssemblyName, FQFileName)
            VALUES ('System.Drawing', 'C:\Windows\Microsoft.NET\Framework\v2.0.50727\System.Drawing.dll')
        INSERT INTO #References (AssemblyName, FQFileName)
            VALUES ('itextsharp', @FilePath + 'itextsharp.dll')

        DECLARE @DropWrapperSQL varchar(MAX)
        SET @DropWrapperSQL = '
            IF OBJECT_ID(''dbo.udfRenderPDF'') IS NOT NULL
            BEGIN
                DROP FUNCTION dbo.udfRenderPDF;
            END '

        DECLARE @CreateWrapperSQL varchar(MAX)
        SET @CreateWrapperSQL = '
            CREATE FUNCTION [dbo].[udfRenderPDF](
                @TemplatePDF varbinary(MAX),
                @FieldsXML xml
            )
            RETURNS [varbinary](MAX) WITH EXECUTE AS CALLER
            AS EXTERNAL NAME [PDFCLR].[Functions].[RenderPDF] '

        --C# source code.
        --Paste CLR source in below. Replace all occurrences of a single quote
        --with two single quotes.
        DECLARE @SourceCode nvarchar(MAX)
        SET @SourceCode = '
        //------start of CLR source------
        using System;
        using System.Data;
        using System.Data.SqlClient;
        using System.Data.SqlTypes;
        using Microsoft.SqlServer.Server;

        ....rest of C# source code goes here

        //------end of CLR source------
        '

        EXEC dbo.spsysBuildCLRAssembly
            @AssemblyName = 'PDFCLR',
            @FileName = 'PDFCLR_SQLCLR.cs',
            @FilePath = @FilePath,
            @DropWrapperSQL = @DropWrapperSQL,
            @CreateWrapperSQL = @CreateWrapperSQL,
            @SourceCode = @SourceCode
    END

    Calling this spExample_RegisterAssembly_PDFCLR procedure will build the C# source code provided in @SourceCode, and will sign the .DLL, register all referenced assemblies, create an asymmetric key and associated login for the assembly, and create the assembly in SQL, along with wrapper procedures. (See the attached files to download the required routines.)

    In this way, executing your stored procedure will do everything necessary to build and deploy this CLR assembly, even if you restore your database to a different server.

    There is no need to use Visual Studio, or to access any external files: everything is encapsulated in your database, and can be run from a simple T-SQL stored procedure.
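    Once deployed, exercising the assembly is plain T-SQL. A minimal usage sketch, assuming the attached routines exist and the procedure above has been created (the DATALENGTH check is just an illustration, not part of the original code):

    ```sql
    -- Redeploy the CLR assembly after changing @SourceCode, then call the
    -- wrapper function it registers; DATALENGTH shows that PDF bytes came back.
    EXEC dbo.spExample_RegisterAssembly_PDFCLR;
    SELECT DATALENGTH(dbo.udfRenderPDF(NULL, N'<Fields/>')) AS PDFByteCount;
    ```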


    Here is a step-by-step list of the work this stored procedure will do:

  • Retrieve C# source from SQL
  • Write C# source to a temporary .cs file
  • Enable CLR support in the database (if necessary)
  • Temporarily enable xp_cmdshell (just during execution of this procedure)
  • Write and execute a batch file that does the following:
  • Generate a signature with the command line "sn" (strong name) tool
  • Build the C# source into a signed DLL using the command line "csc" (C Sharp Compiler) compiler
  • Disable xp_cmdshell (for security reasons)
  • Drop the SQL wrapper function that wraps the CLR method (if it exists)
  • Drop the CLR assembly (if it exists)
  • Create a key to secure the assembly:
  • Create an asymmetric key (dropping it if it exists)
  • Create a SQL login from the key (dropping it if it exists)
  • Grant rights to the login
  • Create the assembly in SQL
  • Create the SQL wrapper function that wraps the CLR method

    As easy as 1, 2, 3…11. And that is part of what I mean about the complexity of deploying and maintaining CLR assemblies in SQL: there are a lot of steps to learn how to do (and then remember to do). These steps need to be done any time you deploy this database to a new server. Being able to do all of these things by executing a single stored procedure simplifies things greatly.

    (Note that the sequence of some of these steps has been altered slightly in the final version of the code that is attached.)

    Step 1: Retrieve C# source from SQL

    We try to avoid storing the C# source in a file because we want everything needed to create the assembly to be encapsulated within the database. The source could be stored in a table, or, as I have done here, the source code can be stored as a string literal inside the stored procedure.

    What I have done is copy-and-pasted the C# source from Visual Studio, then used search-and-replace to replace single quote characters with two single quote characters, and then assigned this string to a variable that will later get written out to a temporary .cs file.
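    A tiny self-contained illustration of the quote-doubling rule (the snippet itself is made up):

    ```sql
    -- Each single quote inside the C# source must be doubled so the whole
    -- program survives as one T-SQL string literal.
    DECLARE @Snippet nvarchar(MAX)
    SET @Snippet = 'Console.WriteLine(''Hello from CLR'');'
    PRINT @Snippet
    -- prints: Console.WriteLine('Hello from CLR');
    ```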


    DECLARE @CLRSource nvarchar(MAX)
    SET @CLRSource = '
    //------start of CLR source------
    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Data.SqlTypes;
    using Microsoft.SqlServer.Server;
    ....
    //------end of CLR source------
    '

    Step 2: Write C# source to a temporary .cs file

    Phil Factor writes some useful SQL code. One of his articles gives us a utility procedure we can use to easily write a string to a file. I use this procedure to write the C# source to a .cs file.


    EXEC dbo.sputilWriteStringToFile
        @FileData = @CLRSource,
        @FilePath = @FilePath,
        @FileName = @FileName

    Step 3: Enable CLR support

    This procedure will deploy a CLR assembly. Obviously we need CLR support enabled in the database.


    IF NOT EXISTS(SELECT * FROM sys.configurations WHERE name = 'clr enabled')
    BEGIN
        SET @SQL = '
            EXEC master.dbo.sp_configure ''show advanced options'', 1
            RECONFIGURE
            EXEC master.dbo.sp_configure ''clr enabled'', 1
            RECONFIGURE'
        EXEC(@SQL)
    END

    Step 4: Temporarily enable xp_cmdshell

    I completely understand and agree that xp_cmdshell can introduce a number of security issues, and is best avoided in production databases. My approach here is that this stored procedure will enable xp_cmdshell temporarily. It will be enabled just long enough to call a batch file that the procedure will dynamically create.

    In my opinion, this use of xp_cmdshell is safe and appropriate: it will only be called at install-time by an administrator, will be used to execute carefully scripted statements, and will be immediately disabled.


    SET @SQL = '
        EXEC master.dbo.sp_configure ''show advanced options'', 1
        RECONFIGURE
        EXEC master.dbo.sp_configure ''xp_cmdshell'', 1
        RECONFIGURE'
    EXEC(@SQL)

    Step 5: Create a batch file that will be executed

    We need to execute the strong name command line utility (sn.exe), and also the command line C# compiler (csc.exe).


    This CLR assembly requires iTextSharp, an open source library for creating PDFs (from ). Download it, and copy the itextsharp.dll file to c:\ServerEnvironment (or a folder of your choosing, updating the script as necessary).


    The sn.exe and csc.exe utilities are part of the "Windows SDK for Windows Server 2008 and .NET Framework 3.5", available as a free download at


    SQL Server 2005 and 2008 CLR support is limited to .NET Framework 3.5. SQL Server 2012 introduces support for .NET Framework 4.0, but can also run .NET Framework 3.5. This procedure uses .NET Framework 3.5, which is our only choice on SQL 2005, 2008, and 2008 R2.

    Figuring out all of the command line parameters needed took a bit of research, but now that that is done the procedure can automatically output the necessary parameters to the batch file.


    DECLARE @Command varchar(2048)
    SET @Command = '"C:\Program Files\Microsoft SDKs\Windows\v6.1\Bin\sn" -k '
        + @FilePath + '\' + 'PDFCLR_keyPair.snk' + @CRLF
        + '"C:\Windows\Microsoft.NET\Framework\v3.5\csc" /t:library'
        + ' /reference:c:\ServerEnvironment\itextsharp.dll'
        + ' /out:' + @FilePath + '\' + REPLACE(@FileName, '.cs', '.dll')
        + ' /keyfile:' + @FilePath + '\' + 'PDFCLR_keyPair.snk'
        + ' ' + @FilePath + '\' + @FileName

    EXEC dbo.sputilWriteStringToFile
        @FileData = @Command,
        @FilePath = @FilePath,
        @FileName = 'tmp.bat'

    Step 6: Disable xp_cmdshell

    We don't want to leave xp_cmdshell enabled, and the procedure is done with it.


    SET @SQL = '
        EXEC master.dbo.sp_configure ''show advanced options'', 1
        RECONFIGURE
        EXEC master.dbo.sp_configure ''xp_cmdshell'', 0
        RECONFIGURE'
    EXEC(@SQL)

    Step 7: Drop the wrapper SQL function

    CLR assemblies expose methods, but SQL requires a SQL function that is tied to the method in the assembly. Since we want to drop the assembly if it exists, we must first drop the wrapper function.


    IF OBJECT_ID('dbo.udfRenderPDF') IS NOT NULL
    BEGIN
        IF @Debug = 1 PRINT '***Dropping existing function'
        SET @SQL = 'DROP FUNCTION dbo.udfRenderPDF'
        EXEC(@SQL)
    END

    Step 8: Drop the existing CLR assembly, if it exists

    We want to replace the current assembly (if any), so we have to drop it if it exists.


    IF ASSEMBLYPROPERTY('PDFCLR', 'MvID') IS NOT NULL
    BEGIN
        IF @Debug = 1 PRINT '***Dropping existing CLR assembly'
        SET @SQL = 'DROP ASSEMBLY PDFCLR'
        EXEC(@SQL)
    END

    Step 9: Create a key to secure the assembly

    This is one of the harder parts to understand, but a detailed explanation is beyond the scope of this article. I'll try to give a quick overview:

    CLR code can do anything, including harmful or malicious things. CLR code that does potentially dangerous things (such as deleting files from the file system) gets flagged as "unsafe". SQL prevents "unsafe" CLR assemblies from being loaded in order to protect the server environment from harmful or malicious things. SQL will permit "unsafe" CLR assemblies if one of two things is true: a) the TRUSTWORTHY database property is enabled, or b) the assembly is signed and tied to a key and login in SQL.

    TRUSTWORTHY is a bad idea, because in effect it says that ANY "unsafe" assembly can be loaded. We don't want to open the door to load any and every "unsafe" assembly. If we did, a person could register harmful or malicious .DLLs without the DBA's knowledge. Additionally, a person could potentially change the .DLL in the file system without the DBA's knowledge, and SQL would then continue to permit users to call methods in the now-rogue assembly. (Think of TRUSTWORTHY as SQL deeming the entire physical server and everything on it as being safe or "trustworthy".)

    Signing the assembly is a much better idea. It is slightly complex to do, but the concept isn't too difficult. This involves signing the assembly with a cryptographic signature, creating an asymmetric key in SQL based on this signature, creating a SQL login associated with the key, and granting appropriate rights to this login. This in effect gives us the ability to say that this particular user is allowed to load this particular "unsafe" assembly.

    Putting this another way, signing the assembly assures the DBA that only assemblies approved by the DBA will be used by SQL. I won't dig into everything that is happening in the mechanics of signing the code, but will instead simply show you how to do it.


    SET @SQL = '
        USE master;
        IF EXISTS(SELECT * FROM sys.syslogins WHERE name = ''PDFCLR_SQLCLRLogin'')
            DROP LOGIN PDFCLR_SQLCLRLogin
        IF EXISTS(SELECT * FROM sys.asymmetric_keys WHERE name = ''PDFCLR_SQLCLRKey'')
            DROP ASYMMETRIC KEY PDFCLR_SQLCLRKey
        CREATE ASYMMETRIC KEY PDFCLR_SQLCLRKey
            FROM EXECUTABLE FILE = ''' + @FilePath + '\' + REPLACE(@FileName, '.cs', '.dll') + '''
        CREATE LOGIN PDFCLR_SQLCLRLogin FROM ASYMMETRIC KEY PDFCLR_SQLCLRKey
        GRANT EXTERNAL ACCESS ASSEMBLY TO PDFCLR_SQLCLRLogin'
    EXEC(@SQL)

    Step 10: Create the assembly in SQL

    Now we can create the assembly in SQL. This terminology can be a bit confusing, because the .NET assembly is already created (i.e. the C# code has already been compiled and the .DLL already exists). Really what we are doing here is "registering" the assembly for use by SQL, even though the SQL command is "CREATE ASSEMBLY".


    SET @SQL = '
        CREATE ASSEMBLY PDFCLR
        FROM ''' + @FilePath + '\' + REPLACE(@FileName, '.cs', '.dll') + '''
        WITH PERMISSION_SET = UNSAFE'
    EXEC(@SQL)

    Note: This particular assembly (which renders PDF documents) requires "unsafe" operations. Some assemblies may not require "unsafe" operations, and can therefore use a different setting for PERMISSION_SET.

    Step 11: Create the SQL wrapper function

    Finally we can create the SQL wrapper function associated with the method in the CLR assembly. Parameters and types in the SQL wrapper must exactly match those in the C# code.
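    For orientation, here is a hypothetical sketch of the C# side the wrapper must line up with. The class and method names come from the EXTERNAL NAME clause used throughout this article, but the exact parameter types are assumptions based on the usual SQL-to-CLR type mappings (varbinary(MAX) maps to SqlBytes, xml maps to SqlXml):

    ```csharp
    using System.Data.SqlTypes;
    using Microsoft.SqlServer.Server;

    public partial class Functions
    {
        // The SQL wrapper's parameter and return types must mirror this signature.
        [SqlFunction]
        public static SqlBytes RenderPDF(SqlBytes templatePDF, SqlXml fieldsXML)
        {
            // ... render the PDF here and return its bytes ...
            return SqlBytes.Null;
        }
    }
    ```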


    SET @SQL = '
        CREATE FUNCTION [dbo].[udfRenderPDF](
            @TemplatePDF varbinary(MAX),
            @FieldsXML xml
        )
        RETURNS [varbinary](MAX) WITH EXECUTE AS CALLER
        AS EXTERNAL NAME [PDFCLR].[Functions].[RenderPDF]'
    EXEC(@SQL)

    Trying it out

    Finally, we will try out the results of all our hard work by executing the new function we just created, and seeing how it renders a PDF file.

    (Little is said here about what this assembly actually does or how to use it. Stay tuned for tomorrow's article "Rendering PDFs Natively in SQL" for details on this particular assembly.)


    DECLARE @FieldsXML xml
    SET @FieldsXML = CAST(
        '<Fields>
            <Field>
                <TextValue>Hello World</TextValue>
                <XPos>100</XPos>
                <YPos>700</YPos>
                <FontSize>18</FontSize>
            </Field>
            <Field>
                <TextValue>Another line, just for fun.</TextValue>
                <XPos>150</XPos>
                <YPos>650</YPos>
                <FontSize>12</FontSize>
            </Field>
        </Fields>' AS xml)

    DECLARE @PDFTemplate varbinary(MAX)
    SET @PDFTemplate = NULL

    DECLARE @ResultPDF varbinary(MAX)
    SET @ResultPDF = dbo.udfRenderPDF(@PDFTemplate, @FieldsXML)

    /* The PDF file now exists in the @ResultPDF variable. You can do whatever
       you want with the data. To write the binary data to a file on the server,
       so that you can open it in Adobe Acrobat Reader, you can use this utility
       procedure (see attached). */
    EXEC [dbo].[sputilWriteBinaryToFile]
        @FileData = @ResultPDF,
        @FilePath = 'C:\Temp',
        @Filename = 'test.pdf'

    Summary

    There are a lot of steps involved in properly deploying a CLR assembly in SQL. But the good news is that once these steps are encapsulated within a stored procedure, the procedure can be executed any time the CLR source code is updated and any time you need to deploy the CLR assembly to a different machine.

    Both the C# source and the script to build, sign and register it are resident in the SQL database, and as such get backed up and restored along with all other SQL objects. The DBA can see exactly what is going on in the assembly, both in terms of the C# source and the various compiler options, all in one place, simply by looking at the source of this stored procedure. Also, the DBA doesn't even need to open Visual Studio: everything can be done from native T-SQL.

    Visual Studio is a wonderful development tool, and is useful when developing the C# code. But in my opinion, a deployment script implemented in a SQL stored procedure is a much nicer way for a T-SQL developer or DBA to deploy and update CLR assemblies in SQL.

    Will I use CLR for everything? No, definitely not. But now when I need to use a CLR assembly I can do so with greater safety and greater ease than I could without the techniques described in this article.

    (See attached file for full source code. You can download and execute BuildAndRegisterCLRAssembly.sql to create all procedures and functions referenced here, as well as to execute the example shown above.)


    How to Use PHPbrew and VirtPHP

    We’ve all been in the situation where we have one version of PHP installed. Maybe that version is whatever came installed on our operating system. Maybe it is a version bundled into MAMP/WAMP/XAMPP.

    How do you go about switching that PHP version?

    How do you switch to one version, then switch back again?

    How do you go about switching that version of PHP, but only for one single application on your computer?

    The Ruby and Python communities have had tools for dealing with this for years. PHP has them now too, but there was nowhere near enough fanfare.


    PHPbrew is an amazing little tool which has been out since 2012.

    It builds and installs PHP versions, and places them into a folder in your home directory. This means you can manage your own versions of PHP. PHPbrew will build various versions, place them in the home folder, and let you switch between them whenever you want.

    Installing PHPbrew

    It should be worth noting that PHPbrew has a fair few requirements, but they are not tough to install. I did not have to install anything, as after using this Macbook for over two years I had all the requirements anyway.

    If you are a Mac OS X user – and I will continue to assume that you are – then you can use Homebrew (no relation) to install dependencies.

    brew install automake autoconf curl pcre re2c mhash libtool icu4c gettext jpeg libxml2 mcrypt gmp libevent brew link icu4c

    Then you will need to install PHPbrew itself:

    curl -L -O 
    chmod +x phpbrew
    sudo mv phpbrew /usr/bin/phpbrew

    This downloads PHPbrew, adds the “executable” permission and moves it to /usr/bin directory.

    Hop over to the basic usage instructions to see how to get things initialized in more detail, but the basics should just be:

    phpbrew init

    With phpbrew initialised you will need to add these lines to your .bashrc:

    echo "source $HOME/.phpbrew/bashrc" >> ~/.bashrc

    If you are using a non-default shell like ZSH then you will need to edit your .zshrc file instead.
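    For example, the equivalent one-liner for ZSH (same bootstrap line as above, different rc file):

    ```shell
    # Append the phpbrew bootstrap to ~/.zshrc instead of ~/.bashrc.
    echo "source $HOME/.phpbrew/bashrc" >> ~/.zshrc
    ```
    
    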

    Installing PHP using PHPbrew

    Before we can install a version of PHP, we need to see which versions are available to PHPbrew. We can do this with a simple command:

    phpbrew known
    Available stable versions:
      5.6+  5.6.0
      5.5+  5.5.17, 5.5.16, 5.5.15, 5.5.14, 5.5.13, 5.5.12, 5.5.11, 5.5.10
      5.4+  5.4.33, 5.4.32, 5.4.31, 5.4.30, 5.4.29, 5.4.28, 5.4.27, 5.4.26
      5.3+  5.3.29, 5.3.28, 5.3.27, 5.3.26, 5.3.25, 5.3.24, 5.3.23, 5.3.22

    At the time of writing, PHP 5.6.0 is the latest version, and versions of PHP before 5.3 are not supported.

    We want to install PHP 5.6.0, so we can use all the great new features; let's ask phpbrew to do that:

    phpbrew install 5.6.0

    Note that if you are using PHPbrew 1.14 or earlier then this would fail on some systems with an error about not having XML enabled. When XML is missing, PHPbrew will fail to install something called PEAR and the build will break. We can get around that using the +xml_all option:

    phpbrew install 5.6.0 +xml_all

    This +xml_all option is what PHPbrew calls “Variants”, and there are a lot more available.


    When installing PHP yourself, there are lots of options to enable or disable features. PHPbrew simplifies this and abstracts it, using a feature called Variants.

    Things like database drivers, curl, the GD image library and JSON are all available as optional variants.

    PHPbrew has one variant called “default”, which – contrary to expectation – is not used by default. Instead it acts as a shortcut for enabling the following variants:

  • bcmath
  • bz2
  • calendar
  • cli
  • ctype
  • dom
  • fileinfo
  • filter
  • ipc
  • json
  • mbregex
  • mbstring
  • mhash
  • pcntl
  • pdo
  • posix
  • readline
  • sockets
  • xml_all
  • zip
    The default may contain more than you need, so a more granular approach may be more to your liking.

    Let's say we just want to install PHP 5.6.0 to build a CLI application that uses PDO to talk to a SQLite database. For that, we can do the following:

    phpbrew install 5.6.0 +cli +pdo +sqlite +xml_all

    This command will enable the PDO extension itself, and sqlite enables the SQLite drivers. The cli variant will install the command-line interface, and xml_all will stop PHPbrew complaining about PEAR.

    If you have any trouble installing a version of PHP, try running the same command but add the -d option. This will send debug information to the console, instead of sending it to a log file.

    phpbrew install -d 5.6.0 +default +sqlite Switching PHP versions

    So, at this point we should have a version of PHP installed.

    If the installation was a success then PHPbrew will output a message like this:

    Congratulations! Now you have PHP with php-5.6.0.
    To use the newly built PHP, try the line(s) below:
        $ phpbrew use php-5.6.0
    Or you can use switch command to switch your default php version to php-5.6.0:
        $ phpbrew switch php-5.6.0

    The first command listed, use, will let you use PHP 5.6.0 while you’re in that console session. If you close the tab/window or restart your computer then you’ll be back to whichever version of PHP is the default.

    The second command, switch, will switch the default version of PHP that PHPbrew will go to in a new session.

    Let's try setting the default version to be PHP 5.6.0, and see if it works.

    $ phpbrew switch php-5.6.0
    $ php -v
    PHP 5.6.0 (cli) (built: Sep 30 2014 15:30:22)
    Copyright (c) 1997-2014 The PHP Group
    Zend Engine v2.6.0, Copyright (c) 1998-2014 Zend Technologies

    The output above shows us exactly what we want to see: PHP 5.6.0.

    If we now try installing the older PHP 5.5, we can once again use $ phpbrew known to see which versions are available. Pick a version, and try to install it:

    phpbrew install 5.5.17 +default +sqlite

    This will install PHP 5.5.17 with the default and sqlite variants. To then use PHP 5.5.17, we have to run another command:

    $ phpbrew use php-5.5.17
    $ php -v
    PHP 5.5.17 (cli) (built: Sep 30 2014 17:41:05)
    Copyright (c) 1997-2014 The PHP Group
    Zend Engine v2.5.0, Copyright (c) 1998-2014 Zend Technologies

    Now we can use PHP 5.5.17 in this session, and go back to PHP 5.6.0 whenever we like. As we used the use command, not switch, when we open another tab or window PHP 5.6.0 will once again be there.

    PHPbrew vs. System

    When we are using a PHPbrew version of PHP, our bash session will be using a special path for the PHP binary. We can find out which version is in use with the which command:

    $ which php
    /Users/phil/.phpbrew/php/php-5.6.0/bin/php

    If we would like to stop using this phpbrew version of PHP and go back to the system version, we can use the off command.

    $ phpbrew off
    phpbrew is turned off.
    $ which php
    /usr/bin/php
    $ php -v
    PHP 5.4.24 (cli) (built: Jan 19 2014 21:32:15)
    Copyright (c) 1997-2013 The PHP Group
    Zend Engine v2.4.0, Copyright (c) 1998-2013 Zend Technologies

    Once again we’re using the default version. You may not need this, but it is handy to know how to get rid of phpbrew. The which command will also help you out with debugging.
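    Under the hood this on/off switching is just PATH precedence. A throwaway sketch (no phpbrew involved; the stub script below is made up) showing how the first matching directory in PATH decides which php wins:

    ```shell
    # Create a stub "php" in a temp directory and put that directory first in
    # PATH; `command -v php` now resolves to the stub, the same way phpbrew's
    # per-version bin directories win over /usr/bin/php.
    demo=$(mktemp -d)
    printf '#!/bin/sh\necho demo-php\n' > "$demo/php"
    chmod +x "$demo/php"
    PATH="$demo:$PATH"
    command -v php
    "$demo/php"
    ```
    
    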

    PHPbrew by itself is a useful tool, and you might find this is all you need. That said, these versions of PHP can get complicated when you find that various projects need more extensions. PHPbrew can add PECL extensions, but not on a project-by-project basis.

    It also assumes you can remember which version of PHP an application should be using. It might not be the default, and running it under another version could cause problems.

    To do this, we need to look at using another tool on top of PHPbrew.


    VirtPHP lets you make isolated PHP environments on a single machine, like Python’s virtualenv. That might sound a little complicated but the idea is simple.

    To start, you mark one directory which contains an application or component, and give it a name. Imagine we were working on "airpair-api", a PHP app; we'd want to make that its own environment.

    Then, we could install PECL extensions that "airpair-api" needs, without affecting other applications.

    That’s the theory, so let’s take a look at how we do that.

    Installing virtPHP

    Go to the virtPHP releases page and find the latest release. It will have a link saying “virtphp.phar”, and you’ll want to right click and copy that URL.

    $ wget <the copied virtphp.phar URL>
    $ chmod +x virtphp.phar
    $ sudo mv virtphp.phar /usr/bin/virtphp

    Now we can check to see if it is working:

    $ virtphp -V
    virtPHP version v0.5.1-alpha 2014-08-13 16:05:47

    Creating Environments

    VirtPHP maintains a relaxed approach to which version of PHP is in use. When you go to create an environment, it will take whichever version of PHP you have in your console session and reference that.

    So, before we try to make a new environment, we need to be certain we are using the right version.

    $ which php
    /Users/phil/.phpbrew/php/php-5.6.0/bin/php

    Oops, it’s still using the default version and I want to make sure my codebase is working on PHP 5.5.

    $ phpbrew use 5.5.17
    $ which php
    /Users/phil/.phpbrew/php/php-5.5.17/bin/php

    Perfect, the currently enabled version of PHP is 5.5.17, which in this example is the one we want.
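    If you script environment creation, it is worth automating that check. Here is a hypothetical helper (the names are mine, not part of virtPHP or PHPbrew) that parses the first line of php -v output:

    ```python
    import re

    def php_version(php_v_output):
        """Extract the version number from `php -v` output.

        Assumes the usual first line, e.g.
        'PHP 5.5.17 (cli) (built: Sep 30 2014 17:41:05)'.
        """
        m = re.match(r"PHP (\d+\.\d+\.\d+)", php_v_output)
        return m.group(1) if m else None

    def is_expected(php_v_output, wanted="5.5.17"):
        # Guard to run before `virtphp create`: refuse to continue
        # if the active PHP is not the version the project needs.
        return php_version(php_v_output) == wanted
    ```

    A wrapper script could run php -v, feed the output through this check, and bail out before creating an environment against the wrong interpreter.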

    Now we can make an environment.

    $ virtphp create airpair-api

    You will see a lot of output, and if all goes well then you should see the following:

    Your virtual php environment (airpair-api) has been created!
    You can activate your new environment using:
    ~$ source /Users/phil/.virtphp/envs/airpair-api/bin/activate

    At this point the new environment is ready to use, but not enabled. Copy the command it gives you and run it, or run this shorter version:

    $ source ~/.virtphp/envs/airpair-api/bin/activate

    Now you should see the name of the environment (airpair-api) in the console prompt, before the $ character. This lets you know that you are in an environment, so you can deactivate it or act accordingly.

    Playing in the Sandbox

    Now we have this environment, we can install and configure things without affecting other PHP installations.

    One great use-case for using environments is being able to install PECL extensions. Not only can you test how an app works with or without the extension, but you can try different versions too.

    (airpair-api) $ pecl install xdebug
    (airpair-api) $ pecl install memcached-1.0.2

    This installs the great debugging tool Xdebug, and a specific version of the memcached extension.

    At the time of writing, the PECL command installs packages via the PEAR-based system. In future versions of virtPHP, PECL extensions will install via the new and improved Pickle system. This will remove a few issues that OS X seems to have with supporting PEAR.

    Exiting an Environment

    To check if you’re still using an environment, two things will help. The first clue is to see the environment name in brackets in your command prompt. The second is to use which php and see if its pointing to a virtPHP environment.

    (airpair-api) $ which php
    /Users/phil/.virtphp/envs/airpair-api/bin/php
    (airpair-api) $ deactivate
    $ which php
    /Users/phil/.phpbrew/php/php-5.6.0/bin/php

    There you can see we were using the airpair-api environment. Then, after deactivating it, the console fell back to using 5.6.0 installed from PHPbrew, as that is the default.
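    That fallback can also be detected programmatically. A small sketch (the helper name and home path are illustrative) based on the ~/.virtphp/envs/<name>/bin/php layout shown above:

    ```python
    import os

    def active_virtphp_env(php_path, home="/Users/phil"):
        """Return the virtPHP environment name a php binary belongs
        to, or None if the path is not inside ~/.virtphp.

        Assumes the directory layout from the examples above:
        ~/.virtphp/envs/<name>/bin/php
        """
        prefix = os.path.join(home, ".virtphp", "envs") + os.sep
        if not php_path.startswith(prefix):
            return None
        # first path component after envs/ is the environment name
        return php_path[len(prefix):].split(os.sep)[0]
    ```

    Feeding it the output of which php gives you the same answer as the prompt prefix, which is handy inside scripts where there is no prompt to look at.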


    Playing with many installed versions like this can at first seem a little confusing. In reality, there is much less to learn here than trying to teach a beginner developer all about a full stack.

    If a new developer was to try and build a basic PHP app, traditionally they would go through the following steps to get started:

  • Apache/nginx
  • MySQL
  • Virtual Hosts and /etc/hosts
  • Hack the core OS PHP version
  • Maybe install XAMPP/WAMP/MAMP instead
  • Try to upgrade the core versions or *AMP versions
  • Get confused that system PHP and MAMP PHP is different
  • Try to install PECL extensions to MAMP version, but install them to system PHP instead
    You can avoid a lot of that pain with a tool like Vagrant and a provision script, but that assumes that this beginner is in a team. If going solo, the beginner would have a much harder time getting started.

    This is how the Ruby on Rails community has done things for years. Teach beginners a framework, abstract away a lot of the hard stuff, get them building and let them learn more about it all as they grow.

    Grab your PHP version, install what you need, run the PHP development server with php -S and only beef up your stack when (or if) you need to.

    Dev/Prod parity is important, but sometimes you can get away with not caring too much if it's just a simple little HTTP service. If you already have CI testing in place then this is even more true.

    Finally, even if you don’t want to run the code through the development server, having PHPbrew and virtPHP is still useful. You can install new versions as soon as they come out to play with the new syntax, and not break all your apps.

    Senior Engineer, author of “Build APIs You Won’t Hate” and a PHP standards activist for the Framework Interoperability Group.

    A Beginner's Guide to Using pyGTK and Glade

    The beauty of pyGTK and Glade is they have opened up cross-platform, professional-quality GUI development to those of us who'd rather be doing other things but who still need a GUI on top of it all. Not only does pyGTK allow neophytes to create great GUIs, it also allows professionals to create flexible, dynamic and powerful user interfaces faster than ever before. If you've ever wanted to create a quick user interface that looks good without a lot of work, and you don't have any GUI experience, read on.

    This article is the direct result of a learning process that occurred while programming Immunity CANVAS. Much of what was learned while developing the GUI from scratch was put in the pyGTK FAQ. The pyGTK reference documentation is another resource you no doubt will be using a lot if you delve deeply into pyGTK. It is fair to say that for a small company, using pyGTK over other GUI development environments, such as native C, is a competitive advantage. Hopefully, after reading this article, everyone should be able to put together a GUI using Python, the easiest of all languages to learn.

    As a metric, the CANVAS GUI was written from scratch, in about two weeks, with no prior knowledge of pyGTK. It then was ported from GTK v1 to GTK v2 (more on that later) in a day, and it is now deployed to both Microsoft Windows and Linux customers.

    The Cross-Platform Nature of pyGTK

    In a perfect world, you never would have to develop for anything but Linux running your favorite distribution. In the real world, you need to support several versions of Linux, Windows, UNIX or whatever else your customers need. Choosing a GUI toolkit depends on what is well supported on your customers' platforms. Nowadays, choosing Python as your development tool in any new endeavor is second nature if speed of development is more of a requirement than runtime speed. This combination leads you to choose from the following alternatives for Python GUI development: wxPython, Tkinter, pyGTK and Python/Qt.

    Keeping in mind that I am not a professional GUI developer, here are my feelings on why one should choose pyGTK. wxPython has come a long way and offers attractive interfaces but is hard to use and get working, especially for a beginner. Not to mention, it requires both Linux and Windows users to download and install a large binary package. Qt, although free for Linux, requires a license to be distributed for Windows. This probably is prohibitive for many small companies who want to distribute on multiple platforms.

    Tkinter is the first Python GUI development kit and is available with almost every Python distribution. It looks ugly, though, and requires you to embed Tk into your Python applications, which feels like going backward. For a beginner, you really want to split the GUI from the application as much as possible. That way, when you edit the GUI, you don't have to change a bunch of things in your application or integrate any changes into your application.

    For these reasons alone, pyGTK might be your choice. It neatly splits the application from the GUI. Using libglade, the GUI itself is held as an XML file that you can continue to edit, save multiple versions of or whatever else you want, as it is not integrated with your application code. Furthermore, using Glade as a GUI builder allows you to create application interfaces quickly—so quickly that if multiple customers want multiple GUIs you could support them all easily.
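    Because the GUI is held as plain XML, you can even inspect it with ordinary tools. The sketch below uses a simplified fragment in the libglade style (not a complete Glade file) and Python's standard ElementTree to list the widget ids it declares:

    ```python
    import xml.etree.ElementTree as ET

    # A simplified, hand-written fragment in the libglade v2 style;
    # a real Glade-generated file contains much more detail.
    GLADE_XML = """
    <glade-interface>
      <widget class="GtkWindow" id="serverinfo">
        <child>
          <widget class="GtkHBox" id="hbox1">
            <child><widget class="GtkLabel" id="label1"/></child>
            <child><widget class="GtkEntry" id="entry1"/></child>
          </widget>
        </child>
      </widget>
    </glade-interface>
    """

    def widget_ids(glade_source):
        """List every widget id declared in a libglade-style XML string."""
        root = ET.fromstring(glade_source)
        return [w.get("id") for w in root.iter("widget")]
    ```

    This kind of scriptability is a side benefit of keeping the interface definition out of your application code.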

    Version Issues with GTK and pyGTK

    Two main flavors of GTK are available in the wild, GTK versions 1 and 2. Therefore, at the start of a GUI-building project, you have to make some choices about what to develop and maintain. It is likely that Glade v1 came installed on your machine. You may have to download Glade v2 or install the development packages for GTK to compile the GTK v2 libglade. Believe me, it is worth the effort. GTK v2 offers several advantages, including a nicer overall look, installers for Windows with Python 2.2 and accessibility extensions that allow applications to be customized for blind users. In addition, version 2 comes installed on many of the latest distributions, although you still may need to install development RPMs or the latest pyGTK package.

    GTK v2 and hence pyGTK v2 offer a few, slightly more complex widgets (Views). In the hands of a mighty GUI master, they result in awesome applications, but they really confuse beginners. However, a few code recipes mean you can treat them as you would their counterparts in GTK v1, once you learn how to use them.

    As an example, after developing the entire GUI for CANVAS in GTK v1, I had to go back and redevelop it (which took exactly one day) in GTK v2. Support was lacking for GTK v1 in my customers' Linux boxes, but installing GTK v2 was easy enough. The main exception is Ximian Desktop, which makes pyGTK and GTK v1 easy to install. So, if your entire customer base is running that, you may want to stay with GTK v1. One thing to keep in mind though—a Python script is available for converting projects from Glade v1 to Glade v2, but not vice versa. So if you're going to do both, develop it first in Glade v1, convert it and then reconcile any differences.

    An Introduction to Glade v2

    The theory behind using Glade and libglade is it wastes time to create your GUI using code. Sitting down and telling the Python interpreter where each widget goes, what color it is and what the defaults are is a huge time sink. Anyone who's programmed in Tcl/Tk has spent days doing this. Not only that, but changing a GUI created with code can be a massive undertaking at times. With Glade and libglade, instead of creating code, you create XML files and code links to those files wherever a button or an entry box or an output text buffer is located.

    To start, you need Glade v2 if you don't have it already. Even if you do, you may want the latest version of it. Downloading and installing Glade v2 should be easy enough once you have GTK v2 development packages (the -devel RPMs) installed. However, for most people new to GUI development, the starting window for Glade is intimidatingly blank.

    To begin your application, click the Window Icon. Now, you should have a big blank window on your screen (Figure 1).

    Figure 1. The cross-hatched area in the starting window is a place to put another widget.

    The important thing to learn about GUI development is there are basically two types of objects: widgets, such as labels and entry boxes and other things you can see, and containers for those widgets. Most likely, you will use one of three kinds of containers, the vertical box, the horizontal box or the table. To create complex layouts, it's easiest to nest these containers together in whatever order you need. For example, click on the horizontal box icon. Clicking on the hatched area in window1 inserts three more areas where you can add widgets. Your new window1 should look like Figure 2.

    Figure 2. A basic three-pane vbox with the top pane selected.

    You now can select any of those three areas and further divide it with a vertical box. If you don't like the results, you always can go back and delete, cut and paste or change the number of boxes from the Properties menu (more on that later).

    Figure 3. The top pane has been split by a two-pane hbox, which is selected.

    You can use these sorts of primitives to create almost any sort of layout. Now that we have a beginning layout, we can fill it with widgets that actually do something. In this case, I'll fill them with a label, a text entry, a spinbutton and a button. At first this looks pretty ugly (Figure 4).

    Figure 4. The initial window filled in with widgets.

    Remember that GTK auto-adjusts the sizes of the finished product when it is displayed, so everything is packed together as tightly as possible. When the user drags the corner of the window, it's going to auto-expand as well. You can adjust these settings in the Properties window (go to the main Glade window and click View→Show Properties). The Properties window changes different values for different kinds of widgets. If the spinbutton is focused, for example, we see the options shown in Figure 5.

    Figure 5. The Glade interface for changing a widget's properties is customized for each type of widget.

    By changing the Value option, we can change what the spinbutton defaults to when displayed. Also important is to change the Max value. A common mistake is to change the Value to something high but forget the Max, which causes the spinbutton initially to display the default but then revert to the Max value when it is changed, confusing the user. In our case, we're going to use the spinbutton as a TCP port, so I'll set the maximum to 65535, the minimum to 1 and the default to 80.
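    The clamping behaviour behind that pitfall is easy to model. This is a tiny sketch (plain Python, not GTK code) of how an adjustment treats a requested value:

    ```python
    def spin_value(requested, minimum=1, maximum=65535):
        """Model the adjustment behaviour described above: a
        spinbutton clamps any value to [minimum, maximum], which is
        why setting Value above Max silently snaps back to Max."""
        return max(minimum, min(requested, maximum))
    ```

    So a default of 80 with Max left at its out-of-the-box value would survive only until the first edit, at which point the widget snaps the value back into range.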

    Then, focus on the label1 and change it to read Host:. By clicking on window1 in the main Glade window, you can focus on the entire window, allowing you to change its properties as well. You also can do this by bringing up the widget tree window and clicking on window1. Changing the name to serverinfo and the title to Server Info sets the titlebar and the internal Glade top-level widget name appropriately for this application.

    If you go to the widget tree view and click on the hbox1, you can increase the spacing between Host: and the text-entry box. This may make it look a little nicer. Our finished GUI looks like Figure 6.

    Figure 6. The GUI in Glade does not look exactly like it does when rendered, so don't worry about the size of the Host: area.

    Normally, this would take only a few minutes to put together. After a bit of practice you'll find that putting together even the most complex GUIs using Glade can be accomplished in minutes. Compare that to the time it takes to type in all those Tk commands manually to do the same thing.

    This GUI, of course, doesn't do anything yet. We need to write the Python code that loads the .glade file and does the actual work. In fact, I tend to write two Python files for each Glade-driven project. One file handles the GUI, and the other file doesn't know anything about that GUI. That way, porting from GTK v1 to GTK v2 or even to another GUI toolkit is easy.

    Creating the Python Program

    First, we need to deal with any potential version skew. I use the following code, although a few other entries mentioned in the FAQ do similar things:

    #!/usr/bin/env python
    import sys
    try:
        import pygtk
        #tell pyGTK, if possible, that we want GTK v2
        pygtk.require("2.0")
    except:
        #Some distributions come with GTK2, but not pyGTK
        pass
    try:
        import gtk
        import gtk.glade
    except:
        print "You need to install pyGTK or GTKv2 ",
        print "or set your PYTHONPATH correctly."
        print "try: export PYTHONPATH=",
        print "/usr/local/lib/python2.2/site-packages/"
        sys.exit(1)
    #now we have both gtk and gtk.glade imported
    #Also, we know we are running GTK v2

    Now we are going to create a GUI class called appgui. Before we do that, though, we need to open button1's properties and add a signal. To do that, click the three dots, scroll to clicked, select it and then click Add. You should end up with something like Figure 7.

    Figure 7. After Adding the Event (Signal) Handler

    With this in place, the signal_autoconnect causes any click of the button to call one of our functions (button1_clicked). You can see the other potential signals to be handled in that list as well. Each widget may have different potential signals. For example, capturing a text-changed signal on a text-entry widget may be useful, but a button never changes because it's not editable.
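    Conceptually, signal_autoconnect is just a name-to-callable lookup. Here is a GTK-free sketch of that idea (the class and method names are mine, not pyGTK's):

    ```python
    class SignalTable:
        """Stand-in for wTree.signal_autoconnect(): map handler names
        (as named in the Glade file) to callables, then dispatch."""
        def __init__(self):
            self.handlers = {}

        def autoconnect(self, dic):
            # dic mirrors the dictionary passed to signal_autoconnect
            self.handlers.update(dic)

        def emit(self, name, *args):
            handler = self.handlers.get(name)
            if handler is not None:
                return handler(*args)

    clicks = []
    table = SignalTable()
    table.autoconnect({"on_button1_clicked":
                       lambda widget: clicks.append(widget)})
    table.emit("on_button1_clicked", "button1")
    ```

    The real wTree does the lookup against signal names stored in the .glade file, but the shape of the dictionary you hand it is exactly this.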

    Initializing the application and starting gtk.mainloop() gets the ball rolling. Different event handlers need to have different numbers of arguments. The clicked event handler gets only one argument, the widget that was clicked. While you're at it, add the destroy event to the main window, so the program exits when you close the window. Don't forget to save your Glade project.

    class appgui:
        def __init__(self):
            """
            In this init we are going to display the main
            serverinfo window
            """
            gladefile=""
            windowname="serverinfo"
            self.wTree=gtk.glade.XML(gladefile,windowname)
            # we only have two callbacks to register, but
            # you could register any number, or use a
            # special class that automatically
            # registers all callbacks. If you wanted to pass
            # an argument, you would use a tuple like this:
            # dic = { "on_button1_clicked" : \
            #         (self.button1_clicked, arg1, arg2), ... }
            dic = { "on_button1_clicked" : self.button1_clicked,
                    "on_serverinfo_destroy" : gtk.mainquit }
            self.wTree.signal_autoconnect(dic)
            return

        #####CALLBACKS
        def button1_clicked(self,widget):
            print "button clicked"

    # we start the app like this...
    app=appgui()
    gtk.mainloop()

    It's important to make sure, if you installed pyGTK from source, that you set the PYTHONPATH environment variable to point to /usr/local/lib/python2.2/site-packages/ so pyGTK can be found correctly. Also, make sure you copy your saved .glade file into your current directory. You should end up with something like Figure 8 when you run your new program. Clicking GO! should produce a nifty button-clicked message in your terminal window.

    Figure 8. The Initial Server Info GUI

    To make the application actually do something interesting, you need to have some way to determine which host and which port to use. The following code fragment, put into the button1_clicked() function, should do the trick:

    host=self.wTree.get_widget("entry1").get_text()
    port=int(self.wTree.get_widget("spinbutton1").get_value())
    if host=="":
        return
    import urllib
    page=urllib.urlopen("http://"+host+":"+str(port)+"/")
    data=page.read()
    print data

    Now when GO! is clicked, your program should go off to a remote site, grab a Web page and print the contents on the terminal window. You can spice it up by adding more rows to the hbox and putting other widgets, like a menubar, into the application. You also can experiment with using a table instead of nested hboxes and vboxes for layout, which often creates nicer looking layouts with everything aligned.


    You don't really want all that text going to the terminal, though, do you? It's likely you want it displayed in another widget or even in another window. To do this in GTK v2, use the TextView and TextBuffer widgets. GTK v1 had an easy-to-understand widget called, simply, GtkText.

    Add a TextView to your Glade project and put the results in that window. You'll notice that a scrolledwindow is created to encapsulate it. Add the lines below to your init() to create a TextBuffer and attach it to your TextView. Obviously, one of the advantages of the GTK v2 way of doing things is the two different views can show the same buffer. You also may want to go into the Properties window for scrolledwindow1 and set the size to something larger so you have a decent view space:

    self.logwindowview=self.wTree.get_widget("textview1")
    self.logwindow=gtk.TextBuffer(None)
    self.logwindowview.set_buffer(self.logwindow)

    In your button1_clicked() function, replace the print statement with:

    self.logwindow.insert_at_cursor(data,len(data))
    Now, whenever you click GO! the results are displayed in your window. By dividing your main window with a set of vertical panes, you can resize this window, if you like (Figure 9).

    Figure 9. Clicking GO! loads the Web page and displays it in the TextView.

    TreeViews and Lists

    Unlike GTK v1, under GTK v2 a tree and a list basically are the same thing; the difference is the kind of store each of them uses. Another important concept is the TreeIter, which is a datatype used to store a pointer to a particular row in a tree or list. It doesn't offer any useful methods itself, that is, you can't ++ it to step through the rows of a tree or list. However, it is passed into the TreeView methods whenever you want to reference a particular location in the tree. So, for example:

    import gobject
    self.treeview=self.wTree.get_widget("treeview1")
    self.treemodel=gtk.TreeStore(gobject.TYPE_STRING,
                                 gobject.TYPE_STRING)
    self.treeview.set_model(self.treemodel)

    defines a tree model with two columns, each containing a string. The following code adds some titles to the top of the columns:

    self.treeview.set_headers_visible(gtk.TRUE)
    renderer=gtk.CellRendererText()
    column=gtk.TreeViewColumn("Name",renderer, text=0)
    column.set_resizable(gtk.TRUE)
    self.treeview.append_column(column)
    renderer=gtk.CellRendererText()
    column=gtk.TreeViewColumn("Description",renderer, text=1)
    column.set_resizable(gtk.TRUE)
    self.treeview.append_column(column)

    You could use the following function to add data manually to your tree:

    def insert_row(model,parent,firstcolumn,secondcolumn):
        myiter=model.insert_after(parent,None)
        model.set_value(myiter,0,firstcolumn)
        model.set_value(myiter,1,secondcolumn)
        return myiter

    Here's an example that uses this function. Don't forget to add treeview1 to your glade file, save it and copy it to your local directory:

    model=self.treemodel
    insert_row(model,None,'Helium',
               'Control Current Helium')
    syscallIter=insert_row(model,None,
               'Syscall Redirection',
               'Control Current Syscall Proxy')
    insert_row(model,syscallIter,'Syscall-shell',
               'Pop-up a syscall-shell')

    The screenshot in Figure 10 shows the results. I've replaced the TextView with a TreeView, as you can see.

    Figure 10. An Example TreeView with Two Columns

    A list is done the same way, except you use ListStore instead of TreeStore. Also, most likely you will use ListStore.append() instead of insert_after().

    Using Dialogs

    A dialog differs from a normal window in one important way—it returns a value. To create a dialog box, click on the dialog box button and name it. Then, in your code, render it with gtk.glade.XML(gladefile,dialogboxname). Then call get_widget(dialogboxname) to get a handle to that particular widget and call its run() method. If the result is gtk.RESPONSE_OK, the user clicked OK. If not, the user closed the window or clicked Cancel. Either way, you can destroy() the widget to make it disappear.

    One catch when using dialog boxes: if an exception happens before you call destroy() on the widget, the now unresponsive dialog box may hang around, confusing your users. Call widget.destroy() right after you receive the response and all the data you need from any entry boxes in the widget.
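    A defensive pattern here is to wrap the run() call in try/finally so destroy() always happens, even when the response handler raises. The sketch below uses a stub dialog class instead of real GTK widgets, but the control flow is the same:

    ```python
    class StubDialog:
        """Stand-in for a glade dialog widget (no GTK required)."""
        def __init__(self):
            self.destroyed = False
        def run(self):
            return "RESPONSE_OK"
        def destroy(self):
            self.destroyed = True

    def run_dialog(dialog, handle_response):
        # try/finally guarantees destroy() runs even if the handler
        # raises, so a dead dialog never lingers on screen.
        try:
            response = dialog.run()
            return handle_response(response)
        finally:
            dialog.destroy()
    ```

    With real widgets you would fetch entry values inside handle_response, before the finally block tears the dialog down.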

    Using input_add() and gtk.mainiteration() to Handle Sockets

    Some day, you probably will write a pyGTK application that uses sockets. When doing so, be aware that while your events are being handled, the application isn't doing anything else. When waiting on a socket.accept(), for example, you are going to be stuck looking at an unresponsive application. Instead, use gtk.input_add() to add any sockets that may have read events to GTK's internal list. This allows you to specify a callback to handle whatever data comes in over the sockets.

    One catch when doing this is you often want to update your windows during your event, necessitating a call to gtk.mainiteration(). But if you call gtk.mainiteration() while within gtk.mainiteration(), the application freezes. My solution for CANVAS was to wrap any calls to gtk.mainiteration() within a check to make sure I wasn't recursing. I check for pending events, like a socket accept(), any time I write a log message. My log function ends up looking like this:

    def log(self,message,color):
        """
        logs a message to the log window
        right now it just ignores the color argument
        """
        message=message+"\n"
        self.logwindow.insert_at_cursor(message, len(message))
        self.handlerdepth+=1
        if self.handlerdepth==1 and gtk.events_pending():
            gtk.mainiteration()
        self.handlerdepth-=1
        return
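    The depth-counter trick can be demonstrated without GTK at all. In this sketch, pump_events() stands in for gtk.mainiteration() and deliberately re-enters log(), which without the counter would recurse forever (class and method names are mine):

    ```python
    class LogPump:
        """GTK-free model of the handlerdepth guard above."""
        def __init__(self):
            self.handlerdepth = 0
            self.pumped = 0
            self.lines = []

        def pump_events(self):
            # stands in for gtk.mainiteration(); a pending event
            # handler might itself log a message, re-entering log()
            self.pumped += 1
            self.log("event handled")

        def log(self, message):
            self.lines.append(message)
            self.handlerdepth += 1
            if self.handlerdepth == 1:  # only pump at the top level
                self.pump_events()
            self.handlerdepth -= 1
    ```

    The nested call sees handlerdepth == 2 and skips the pump, so the recursion bottoms out after one level, which is exactly what the CANVAS log function relies on.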

    Moving a GUI from GTK v1 to GTK v2

    The entry in the pyGTK FAQ on porting your application from GTK v1 to GTK v2 is becoming more and more complete. However, you should be aware of a few problems you're going to face. Obviously, all of your GtkText widgets need to be replaced with Gtk.TextView widgets. The corresponding code in the GUI also must be changed to accommodate that move. Likewise, any lists or trees you've done in GTK v1 have to be redone. What may come as a surprise is you also need to redo all dialog boxes, remaking them in GTK v2 format, which looks much nicer.

    Also, a few syntax changes occurred, such as GDK moving to gtk.gdk and libglade moving to gtk.glade. For the most part, these are simple search and replaces. Use GtkTextBuffer.insert_at_cursor() instead of GtkText.insert_defaults, and radiobutton.get_active() instead of reading the active attribute directly, for example. You can convert your Glade v1 file into a Glade v2 file using the Python script that ships with the libglade distribution. This gets you started on your GUI, but you may need to load Glade v2 and do some reconfigurations before porting your code.

    Final Notes

  • Don't forget you can cut and paste from the Glade widget tree. This can make a redesign quick and painless.

  • Unset any possible positions in the Properties window so your startup doesn't look weird.

  • If you have a question you think other people might too, add it to the pyGTK FAQ.

  • The GNOME IRC server has a useful #pygtk channel. I couldn't have written CANVAS without the help of the people on the channel, especially James Henstridge. It's a tribute to the Open Source community that the principal developers often are available to answer newbie questions.

  • The finished demo code is available from

    Deploying CLR Assemblies with T-SQL

    Microsoft introduced the ability to use .NET CLR stored procedures and functions in SQL Server some time ago, starting with SQL Server 2005. Now more than 8 years later I think many developers are like me: I acknowledge the power of CLR routines, but try to avoid using CLR.

    Part of the reason for this avoidance has to do with technical considerations. But truthfully for me, part of the reason also has to do with the increased complexity that CLR introduces into development, deployment, and maintenance of the database.

    This article will demonstrate an approach to deploying and managing CLR routines that may be more comfortable for T-SQL developers and DBAs, and one that does not involve use of Visual Studio. This approach also encapsulates everything needed to deploy the CLR assembly within the database, meaning that a database backup will store all needed dependencies.

    The basic goal of this exercise is to create a stored procedure that, when executed, will compile C# code, sign the .DLL, register the assembly in SQL, and create the wrapper SQL objects. In this way, deployment of the CLR assembly is as easy as running a stored procedure. Everything is taken care of, and is all in one place: no independent .DLLs, Visual Studio projects, or C# source files to keep track of.

    Additionally, this exercise attempts to follow best practices for deployment, such as signing the assembly and properly securing it in SQL. These are things that often get omitted when in a hurry to set up a CLR assembly in SQL.


    For those who just want to skip to the final result: I have created a stored procedure to deploy a sample assembly as follows:

    CREATE PROCEDURE dbo.spExample_RegisterAssembly_PDFCLR
    AS
    BEGIN
        DECLARE @FilePath varchar(1024)
        SET @FilePath = 'c:\ServerEnvironment\'

        CREATE TABLE #References (AssemblyName sysname, FQFileName varchar(1024))
        INSERT INTO #References (AssemblyName, FQFileName)
            VALUES ('System.Drawing',
                    'C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.Drawing.dll')
        INSERT INTO #References (AssemblyName, FQFileName)
            VALUES ('itextsharp', @FilePath + 'itextsharp.dll')

        DECLARE @DropWrapperSQL varchar(MAX)
        SET @DropWrapperSQL = '
            IF OBJECT_ID(''dbo.udfRenderPDF'') IS NOT NULL
            BEGIN
                DROP FUNCTION dbo.udfRenderPDF;
            END
        '

        DECLARE @CreateWrapperSQL varchar(MAX)
        SET @CreateWrapperSQL = '
            CREATE FUNCTION [dbo].[udfRenderPDF](
                @TemplatePDF varbinary(MAX),
                @FieldsXML xml
            )
            RETURNS [varbinary](max)
            WITH EXECUTE AS CALLER
            AS EXTERNAL NAME [PDFCLR].[Functions].[RenderPDF]
        '

        --C# Source Code.
        --Paste CLR source in below. Replace all occurrences of a single
        --quote with two single quotes.
        DECLARE @SourceCode nvarchar(MAX)
        SET @SourceCode = '
        //------start of CLR Source------
        using System;
        using System.Data;
        using System.Data.SqlClient;
        using System.Data.SqlTypes;
        using Microsoft.SqlServer.Server;

        ....Rest of C# source code goes here

        //------end of CLR Source------
        '

        EXEC dbo.spsysBuildCLRAssembly
            @AssemblyName = 'PDFCLR',
            @FileName = 'PDFCLR_SQLCLR.cs',
            @FilePath = @FilePath,
            @DropWrapperSQL = @DropWrapperSQL,
            @CreateWrapperSQL = @CreateWrapperSQL,
            @SourceCode = @SourceCode
    END

    Calling this spExample_RegisterAssembly_PDFCLR procedure will build the C# source code provided in @SourceCode, and will sign the .DLL, register all referenced assemblies, create an asymmetric key and associated login for each assembly, and create the assembly in SQL, along with wrapper procedures. (See the attached files to download the required routines.)

    In this way, executing your stored procedure will do everything needed to build and deploy this CLR assembly–even if you restore your database to a different server.

    There is no need to use Visual Studio, or to access any external files: everything is encapsulated in your database, and can be run from a simple T-SQL stored procedure.


    Here is a step-by-step list of the work this stored procedure needs to do:

  • Retrieve C# source from SQL
  • Write C# source to a temporary .cs file
  • Enable CLR support in the database (if necessary)
  • Temporarily enable xp_cmdshell (just for the duration of execution of this procedure)
  • Write and execute a batch file that does the following:
  • Generate signature with the command line "sn" (Strong Name) tool
  • Build C# source into a signed DLL using the command line "csc" (C Sharp Compiler) compiler
  • Disable xp_cmdshell (for security reasons)
  • Drop the SQL wrapper function that wraps the CLR method (if it exists)
  • Drop the CLR assembly (if it exists)
  • Create key to secure the assembly:
  • Create an asymmetric key (dropping if it exists)
  • Create a SQL login from the key (dropping if it exists)
  • Grant rights to the login
  • Create the assembly in SQL
  • Create the SQL wrapper function that wraps the CLR method
    As easy as 1, 2, 3…11. And that is part of what I mean about the complexity of deploying and maintaining CLR assemblies in SQL: there are a lot of steps to learn how to do (and then remember to do). These steps need to be repeated every time you deploy this database to a new server. Being able to do all of them by executing a single stored procedure simplifies things greatly.

    (Note that the sequence of some of these steps has been altered slightly in the final version of the code that is attached.)

    Step 1: Retrieve C# Source from SQL

    We are trying to avoid storing the C# source in a file because we want everything that is needed to create the assembly to be encapsulated in the database. The source could be stored in a table, or, as I have done here, the source code can be stored as a string literal inside the stored procedure.

    What I have done is copy-and-pasted the C# source from Visual Studio, then used search-and-replace to replace single quote characters with two single quote characters, and then assigned this string to a variable which will later get written out to a temporary .cs file.


    DECLARE @CLRSource nvarchar(MAX)
    SET @CLRSource = '
        //------start of CLR Source------
        using System;
        using System.Data;
        using System.Data.SqlClient;
        using System.Data.SqlTypes;
        using Microsoft.SqlServer.Server;

        ....

        //------end of CLR Source------
    '

    Step 2: Write C# source to a temporary .cs file

    Phil Factor writes some useful SQL code. One of his articles gives us a utility procedure we can use to easily write a string to a file. I use this procedure to write the C# source to a .cs file.


    EXEC dbo.sputilWriteStringToFile
        @FileData = @CLRSource,
        @FilePath = @FilePath,
        @FileName = @FileName

    Step 3: Enable CLR support

    This procedure will deploy a CLR assembly, so obviously we need CLR support enabled on the server.


    IF NOT EXISTS(SELECT * FROM sys.configurations
                  WHERE name = 'clr enabled' AND CONVERT(int, value) = 1)
    BEGIN
        SET @SQL = '
            EXEC master.dbo.sp_configure ''show advanced options'', 1
            RECONFIGURE
            EXEC master.dbo.sp_configure ''clr enabled'', 1
            RECONFIGURE'
        EXEC(@SQL)
    END

    Step 4: Temporarily enable xp_cmdshell

    I fully understand and agree that xp_cmdshell can introduce a number of security problems, and is best avoided in production databases. My approach here is that this stored procedure will enable xp_cmdshell temporarily. It will be enabled just long enough to call a batch file that the procedure will dynamically create.

    In my opinion, this use of xp_cmdshell is safe and appropriate: it will only be called at deploy-time by an administrator, will be used to execute carefully scripted statements, and will be immediately disabled.


    SET @SQL = '
        EXEC master.dbo.sp_configure ''show advanced options'', 1
        RECONFIGURE
        EXEC master.dbo.sp_configure ''xp_cmdshell'', 1
        RECONFIGURE'
    EXEC(@SQL)

    Step 5: Create a batch file that will be executed

    We need to execute the strong name command line application (sn.exe), and also the command line C# compiler (csc.exe).


    This CLR assembly requires iTextSharp, an open source library for creating PDFs. Download it, and copy the itextsharp.dll file to c:\ServerEnvironment (or a folder of your choosing, updating the script as needed).


    The sn.exe and csc.exe utilities are part of the "Windows SDK for Windows Server 2008 and .NET Framework 3.5", available as a free download.


    SQL Server 2005 and 2008 CLR support is limited to .NET Framework 3.5. SQL Server 2012 introduces support for .NET Framework 4.0, but can still run .NET Framework 3.5 assemblies. This procedure uses .NET Framework 3.5, which is the only option on SQL Server 2005, 2008, and 2008 R2.

    Figuring out all the necessary command line parameters took a bit of research, but now that it is done the procedure can automatically output the needed parameters to the batch file.


    DECLARE @Command varchar(2048)
    -- @CRLF is assumed to hold CHAR(13) + CHAR(10)
    SET @Command =
        '"C:\Program Files\Microsoft SDKs\Windows\v6.1\Bin\sn" -k '
        + @FilePath + '\' + 'PDFCLR_keyPair.snk'
        + @CRLF
        + '"C:\Windows\Microsoft.NET\Framework\v3.5\csc" /t:library'
        + ' /reference:c:\ServerEnvironment\itextsharp.dll'
        + ' /out:' + @FilePath + '\' + REPLACE(@FileName, '.cs', '.dll')
        + ' /keyfile:' + @FilePath + '\' + 'PDFCLR_keyPair.snk'
        + ' ' + @FilePath + '\' + @FileName

    EXEC dbo.sputilWriteStringToFile
        @FileData = @Command,
        @FilePath = @FilePath,
        @FileName = 'tmp.bat'
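    For a given @FilePath of c:\ServerEnvironment and @FileName of PDFCLR_SQLCLR.cs, the generated tmp.bat would look something like this (a sketch; the exact paths depend on where the SDK is installed):

    ```bat
    REM Generate a strong-name key pair, then compile and sign the DLL
    "C:\Program Files\Microsoft SDKs\Windows\v6.1\Bin\sn" -k c:\ServerEnvironment\PDFCLR_keyPair.snk
    "C:\Windows\Microsoft.NET\Framework\v3.5\csc" /t:library /reference:c:\ServerEnvironment\itextsharp.dll /out:c:\ServerEnvironment\PDFCLR_SQLCLR.dll /keyfile:c:\ServerEnvironment\PDFCLR_keyPair.snk c:\ServerEnvironment\PDFCLR_SQLCLR.cs
    ```

    The procedure then runs this batch file via xp_cmdshell before disabling it again.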

    Step 6: Disable xp_cmdshell

    We don't want to leave xp_cmdshell enabled, and the procedure is now done with it.


    SET @SQL = '
        EXEC master.dbo.sp_configure ''show advanced options'', 1
        RECONFIGURE
        EXEC master.dbo.sp_configure ''xp_cmdshell'', 0
        RECONFIGURE'
    EXEC(@SQL)

    Step 7: Drop the wrapper SQL function

    CLR assemblies expose methods, but SQL requires a SQL function that is tied to the method in the assembly. Since we want to drop the assembly if it exists, we must first drop the wrapper function.


    IF OBJECT_ID('dbo.udfRenderPDF') IS NOT NULL
    BEGIN
        IF @Debug = 1 PRINT '***Dropping existing function'
        SET @SQL = 'DROP FUNCTION dbo.udfRenderPDF'
        EXEC(@SQL)
    END

    Step 8: Drop the existing CLR assembly, if it exists

    We want to replace the existing assembly (if any), so we have to drop it if it exists.


    IF ASSEMBLYPROPERTY('PDFCLR', 'MvID') IS NOT NULL
    BEGIN
        IF @Debug = 1 PRINT '***Dropping existing CLR assembly'
        SET @SQL = 'DROP ASSEMBLY PDFCLR'
        EXEC(@SQL)
    END

    Step 9: Create key to secure the assembly

    This is one of the harder parts to understand, but a detailed explanation is beyond the scope of this article. I'll try to provide a brief overview:

    CLR code can do anything, including destructive or malicious things. CLR code that does potentially dangerous things (such as deleting files from the file system) gets flagged as "unsafe". SQL prevents "unsafe" CLR assemblies from being loaded in an effort to protect the server environment from destructive or malicious things. SQL will allow "unsafe" CLR assemblies if one of two things is true: a) the TRUSTWORTHY database property is enabled, or b) the assembly is signed and tied to a key and login in SQL.

    TRUSTWORTHY is a bad idea, because it basically says that ANY "unsafe" assembly can be loaded. We don't want to open the door to any and every "unsafe" assembly. If we did, a user could register dangerous or malicious .DLLs without the DBA's knowledge. Also, someone could potentially change the .DLL in the file system without the DBA's knowledge, and SQL would then continue to allow users to call methods in the now-rogue assembly. (Think of TRUSTWORTHY as SQL deeming the entire physical server and everything on it safe or "trustworthy".)

    Signing the assembly is a much better idea. It is slightly complicated to do, but the concept isn't too hard. It involves signing the assembly with a cryptographic signature, creating an asymmetric key in SQL based on that signature, creating a SQL login associated with the key, and granting appropriate rights to that login. This in effect gives us the ability to say that the specified login is allowed to load this specific "unsafe" assembly.

    Putting this another way, signing the assembly guarantees the DBA that only assemblies approved by the DBA will be used by SQL. I will not dig into the mechanics of signing the code, but will instead just show you how to do it.
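    The key-and-login sequence can be sketched in T-SQL like this. This is a minimal sketch, not the attached source: the key name PDFCLR_Key, login name PDFCLR_Login, and DLL path are illustrative.

    ```sql
    -- Sketch only: key/login names and the DLL path are illustrative.
    -- Run in master so the login can be granted server-level rights.
    USE master

    -- Drop the existing login and key, in dependency order
    IF EXISTS (SELECT * FROM sys.server_principals WHERE name = 'PDFCLR_Login')
        DROP LOGIN PDFCLR_Login
    IF EXISTS (SELECT * FROM sys.asymmetric_keys WHERE name = 'PDFCLR_Key')
        DROP ASYMMETRIC KEY PDFCLR_Key

    -- Extract the public key from the signed assembly
    CREATE ASYMMETRIC KEY PDFCLR_Key
        FROM EXECUTABLE FILE = 'c:\ServerEnvironment\PDFCLR_SQLCLR.dll'

    -- Create a login from the key, and grant it the right to load
    -- assemblies marked UNSAFE
    CREATE LOGIN PDFCLR_Login FROM ASYMMETRIC KEY PDFCLR_Key
    GRANT UNSAFE ASSEMBLY TO PDFCLR_Login
    ```

    With this in place, any assembly signed with the matching key pair can be created with PERMISSION_SET = UNSAFE without enabling TRUSTWORTHY.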



    Step 10: Create the assembly in SQL

    Now we can create the assembly in SQL. The terminology can be a little confusing, as the .NET assembly has already been created (i.e. the C# code has already been compiled and the .DLL already exists). Really what we are doing here is "registering" the assembly for use by SQL, though the SQL command is "CREATE ASSEMBLY".


    SET @SQL = '
        CREATE ASSEMBLY PDFCLR
        FROM ''' + @FilePath + '\' + REPLACE(@FileName, '.cs', '.dll') + '''
        WITH PERMISSION_SET = UNSAFE'
    EXEC(@SQL)

    NOTE: This particular assembly (which renders PDF documents) requires "unsafe" operations. Some assemblies may not require "unsafe" operations, and can thus use a less permissive setting for PERMISSION_SET.
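    For reference, the other two permission sets look like this (a sketch; the assembly names and paths are illustrative, not from the attached source):

    ```sql
    -- SAFE: computation only, no access to external resources (the default)
    CREATE ASSEMBLY MathHelpers
        FROM 'c:\ServerEnvironment\MathHelpers.dll'
        WITH PERMISSION_SET = SAFE

    -- EXTERNAL_ACCESS: may access files, the network, the registry, etc.
    CREATE ASSEMBLY FileHelpers
        FROM 'c:\ServerEnvironment\FileHelpers.dll'
        WITH PERMISSION_SET = EXTERNAL_ACCESS
    ```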

    Step 11: Create the SQL wrapper function

    Finally, we can create the SQL wrapper function associated with the method in the CLR assembly. Parameters and types in the SQL wrapper must exactly match those in the C# code.


    SET @SQL = '
        CREATE FUNCTION [dbo].[udfRenderPDF](
            @TemplatePDF varbinary(MAX),
            @FieldsXML xml
        )
        RETURNS [varbinary](max)
        WITH EXECUTE AS CALLER
        AS EXTERNAL NAME [PDFCLR].[Functions].[RenderPDF]'
    EXEC(@SQL)

    Trying it out

    Finally, we can try out the results of all our hard work by executing the new function we just created, and seeing how it renders a PDF file.

    (Little is said here of what this assembly actually does or how to use it. Stay tuned for tomorrow's article, "Rendering PDFs Natively in SQL", for details on this particular assembly.)


    DECLARE @FieldsXML xml
    SET @FieldsXML = CAST('
        <Fields>
            <Field>
                <TextValue>Hello World</TextValue>
                <XPos>100</XPos>
                <YPos>700</YPos>
                <FontSize>18</FontSize>
            </Field>
            <Field>
                <TextValue>One more line, just for fun.</TextValue>
                <XPos>150</XPos>
                <YPos>650</YPos>
                <FontSize>12</FontSize>
            </Field>
        </Fields>' AS xml)

    DECLARE @PDFTemplate varbinary(MAX)
    SET @PDFTemplate = NULL

    DECLARE @ResultPDF varbinary(MAX)
    SET @ResultPDF = dbo.udfRenderPDF(@PDFTemplate, @FieldsXML)

    /*
    The PDF file now exists in the @ResultPDF variable. You can do whatever
    you want with the data. To write the binary data to a file on the server,
    so that you can open it in Adobe Acrobat Reader, you can use this utility
    procedure (see attached).
    */
    EXEC [dbo].[sputilWriteBinaryToFile]
        @FileData = @ResultPDF,
        @FilePath = 'C:\Temp',
        @Filename = 'test.pdf'

    Summary

    There are a lot of steps involved in properly deploying a CLR assembly in SQL. But the good news is that once these steps are encapsulated within a stored procedure, the procedure can be executed any time the CLR source code is updated, and whenever you need to deploy the CLR assembly to a different machine.

    Both the C# source and the script to build, sign and register it are resident in the SQL database, and as such get backed up and restored along with all other SQL objects. The DBA can see exactly what is going on in the assembly, both in terms of the C# source and the various compiler options, all in one place, simply by looking at the source of this stored procedure. Also, the DBA doesn't even need to open Visual Studio: everything can be done from native T-SQL.

    Visual Studio is a fine development tool, and is useful when developing the C# code. But in my opinion, a deployment script implemented in a SQL stored procedure is a much nicer way for a T-SQL developer or DBA to deploy and update CLR assemblies in SQL.

    Will I use CLR for everything? No, definitely not. But now when I need to use a CLR assembly, I can do so with greater safety and greater ease than I could without the techniques described in this article.

    (See attached file for full source code.  You can download and execute BuildAndRegisterCLRAssembly.sql to create all procedures and functions referenced here, as well as to execute the example shown above.)
