Search Results

Search found 996 results on 40 pages for 'compound eye'.

Page 30 of 40

  • Problem with Amiga 1200 accelerator board

    - by cc0
    I just recently walked past a dump, where out of the corner of my eye I spotted something that looked like a huge keyboard. I went to take a closer look and found out that it was an Amiga 1200 with an 030 accelerator board and a Scala dongle. Jackpot! So anyway; I dried it, cleaned it, and it works, but the floppy was not powering on and the same went for the hard drive. I am using an old Amiga 1200 PSU that was making a strange high-pitched noise when I tried to boot the Amiga with the hard drive installed. I removed the hard drive and it booted fine, with the PSU not emitting any detectable noise. However, when I have the 030 installed it sometimes reboots and shows a red "Software Error" screen. I tried removing the memory on the board; same effect. Sometimes it does not boot at all, just gives a black screen. Someone suggested the card had problems with 3.1 ROMs, but this Amiga has only 3.0 ROMs installed. Does anyone have any theories as to why it seems unstable? I don't have any other Amiga parts to cross-swap with to test a lot of things, so I'd really appreciate some sound input here so I'd know what to look for in order to try to fix it. And merry Christmas everyone :]

    Read the article

  • Outbound ports to allow through firewall

    - by dunxd
    This question was asked before, but in a rather general way. I'm asking more specifically based on my current requirements. We have a number of remote offices made up of a bunch of PCs and an ASA 5505 which is used as firewall and VPN termination point. In the offices we share the internet connection with one or more other organisations over whom we have very little control, aside from the config on the ASAs. For a bunch of reasons I'd like to lock down these ASA 5505s to only allow outbound traffic to ports used by applications we know we need. I'm putting together a standard config to roll out to all the ASAs, and if we need to open up ports for the other orgs we can do it on request. But I want to leave open the most commonly required ports so we can get up and running without waiting on other folks' technical staff to get back. I plan to allow the following TCP ports to support commonly required resources: POP3 (110 and 995), HTTP (80 and 443), IMAP4 (143 and 993), SMTP (25 and 465). The question really is, what other ports do I need to leave open to allow for "normal" working? I've seen UDP port 53 for DNS as one. Are there any others that would be worth opening up? Just to note - I'll also be setting up monitoring systems to keep an eye on the ports we do allow. Any of the above could be misused, of course. We'll also back all this up with signed agreements. But I'm aiming for a technical solution where I don't have to start out with the full requirements of everyone we share connections with. See also: outbound ports that are always open
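
    As a rough illustration of the monitoring idea mentioned above, here is a minimal Python sketch that probes the TCP ports in the list; the test host is a placeholder (point it at a machine you control that listens on these ports), and none of this comes from the original question:

        # Probe outbound TCP connectivity for the commonly required ports.
        import socket

        PORTS = {25: "SMTP", 80: "HTTP", 110: "POP3", 143: "IMAP4",
                 443: "HTTPS", 465: "SMTPS", 993: "IMAPS", 995: "POP3S"}
        TEST_HOST = "test-host.example.org"  # placeholder -- substitute your own target

        for port, name in sorted(PORTS.items()):
            try:
                # create_connection() raises OSError (or times out) when the port is blocked
                with socket.create_connection((TEST_HOST, port), timeout=5):
                    print(f"{name} ({port}): outbound connection OK")
            except OSError:
                print(f"{name} ({port}): blocked or unreachable")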

    Read the article

  • Intermittent lockups, unable to diagnose in over a year

    - by Magsol
    Here's a real doozy; I may just give my firstborn child to whoever helps me solve this problem. In July 2008, I assembled what would be my desktop computer for graduate school. Here are the specs of the machine I built: Thermaltake 750W PSU Corsair Dominator 2x2GB 240-pin SDRAM Thermaltake Tower Asus P5K Deluxe Motherboard Intel Core 2 Quad Q9300 2.5GHz CPU 2 x GeForce 8600 GT WD Caviar Blue 640GB hard drive CD burner DVD burner Soon thereafter, I ordered a new motherboard (because I was an idiot; that first motherboard supported CrossFire, not SLI), an Asus P5N-D. I was originally running Windows XP SP3. Pretty much right at the start of the fall semester, my desktop would simply lock up after a while. If my system was largely idling, it would be after 1-3 days. If I was gaming, it often happened an hour or two into my gaming session, indicating a link to activity level. Here's where it started getting interesting. I started looking at the system temps. The CPU was warmer than it should have been (~60s C), so I purchased some more efficient cooling compound and a way better cooler for it. Now it hardly goes over 40 C. Intel was even kind enough to swap it out for free, just to rule it out. Lockups continued. The graphics cards were also running pretty warm: about 60 C idling. Removing one of them seemed to improve stability a little bit...as in, it wouldn't lock up quite as frequently, but it still always eventually locked up. But it didn't matter which card I used or removed; the lockups continued. I reverted back to the original motherboard, the P5K Deluxe. Lockups continued. I purchased an entirely new motherboard, eVGA's nForce 750i. Lockups continued. Ran memtest86+ over and over and over, with no errors. Even RMA'd the memory. Lockups continued. Replaced the PSU with a Corsair 750W PSU. Lockups continued. Tried disconnecting all IDE drives (HDDs are SATA). Lockups continued. Replaced both graphics cards with a single Radeon HD 4980. Average temps are now always around 50 C when idling, 60 C only when gaming. Lockups continued. Throughout the whole ordeal, the system has been upgraded from Windows XP SP3 to Vista 32-bit, to Vista 64-bit, and is now at Windows 7 64-bit. Lockups have occurred at every step along the way (each OS was in place for at least a few months before the next upgrade). Edit: By "upgrade" I mean clean install each time. In addition to those reformats, I have performed many, many other reformats of the system and a reinstall of whatever OS had been previously installed in an attempt to rectify this problem, to no avail. /Edit When the system locks up, there's no blue screen, no reboot, no error message of any kind. It simply freezes in place until I hit the reset button. Very, very rarely, once Windows boots back up, the system informs me that Windows has recovered from an error, but it can never find the source aside from some piece of hardware. I've swapped out every component in this computer, and there are more fans in it than I care to count...though for the sake of completeness: top 80mm case fan (out) rear 80mm case fan (out) rear 120mm case fan (out) front 120mm case fan (in) side 250mm case fan (in) giant CPU fan on-board motherboard fan (the eVGA board) triple-fan memory setup (came with the memory) PSU internal fan another 120mm fan I stuck on the underside of the video card to keep hot air from collecting at the bottom of the case. I'm truly out of ideas. ANY help at all would be oh-so-very GREATLY appreciated. Thank you!

    Read the article

  • LLBLGen Pro v3.1 released!

    - by FransBouma
    Yesterday we released LLBLGen Pro v3.1! Version 3.1 comes with new features and enhancements, which I'll describe briefly below. v3.1 is a free upgrade for v3.x licensees. What's new / changed? Designer Extensible Import system. An extensible import system has been added to the designer to import project data from external sources. Importers are plug-ins which import project meta-data (like entity definitions, mappings and relational model data) from an external source into the loaded project. In v3.1, an importer plug-in for importing project elements from existing LLBLGen Pro v3.x project files has been included. You can use this importer to create source projects from which you import parts of models to build your actual project with. Model-only relationships. In v3.1, relationships of the type 1:1, m:1 and 1:n can be marked as model-only. A model-only relationship isn't required to have a backing foreign key constraint in the relational model data. They're ideal for projects which have to work with relational databases where changes can't always be made or some relationships can't be added (e.g. the ones which are important for the entity model, but are not allowed to be added to the relational model for some reason). Custom field ordering. Although fields in an entity definition don't really have an ordering, it can be important for some situations to have the entity fields in a given order, e.g. when you use compound primary keys. Field ordering can be defined using a pop-up dialog which can be opened in various ways, e.g. inside the project explorer, model view and entity editor. It can also be set automatically during refreshes based on new settings. Command line relational model data refresher tool, CliRefresher.exe. The command line refresh tool shipped with v2.6 is now available for v3.1 as well. Navigation enhancements in various designer elements. It's now easier to find elements like entities, typed views etc. in the project explorer from editors, to navigate to related entities in the project explorer by right-clicking a relationship, to navigate to the super-type in the project explorer when right-clicking an entity, and to navigate to the sub-type in the project explorer when right-clicking a sub-type node in the project explorer. Minor visual enhancements / tweaks LLBLGen Pro Runtime Framework Entity creation is now up to 30% faster and takes 5% less memory. Creating an entity object has been optimized further by tweaks inside the framework to make instantiating an entity object up to 30% faster. It now also takes up to 5% less memory than in v3.0. Prefetch Path node merging is now up to 20-25% faster. Setting entity references required the creation of a new relationship object. As this relationship object is always used internally, it could be cached (as it's used for syncing only). This increases performance by 20-25% in the merging functionality. Entity fetches are now up to 20% faster. A large number of tweaks have been applied to make entity fetches up to 20% faster than in v3.0. Full WCF RIA support. It's now possible to use your LLBLGen Pro runtime framework powered domain layer in a WCF RIA application using the VS.NET tools for WCF RIA Services. WCF RIA Services is a Microsoft technology for .NET 4, typically used within Silverlight applications. SQL Server DQE compatibility level is now per instance. (Usable in Adapter). 
It's now possible to set the compatibility level of the SQL Server Dynamic Query Engine (DQE) per instance of the DQE instead of the global setting it was before. The global setting is still available and is used as the default value for the per-instance compatibility level. You can use this to switch between CE Desktop and normal SQL Server compatibility per DataAccessAdapter instance. Support for COUNT_BIG aggregate function (SQL Server specific). The aggregate function COUNT_BIG has been added to the list of available aggregate functions to be used in the framework. Minor changes / tweaks I'm especially pleased with the import system, as that makes working with entity models a lot easier. The import system lets you import from another LLBLGen Pro v3 project any entity definition, mapping and / or meta-data like table definitions. This way you can build repository projects where you store model fragments, e.g. the building blocks for a customer-order system, a user credential model etc., any model you can think of. In most projects, you'll recognize that some parts of your new model look familiar. In these cases it would have been easier if you had been able to import these parts from projects you had pre-created. With LLBLGen Pro v3.1 you can. For example, say you have an Oracle schema called CRM which contains the bread 'n' butter customer-order-product kind of model. You create an entity model from that schema and save it in a project file. Now you start working on another project for another customer and you have to use SQL Server. You also start using model-first development, so you develop the entity model from scratch as there's no existing database. As this customer also requires a CRM-like entity model, you import the entities from your saved Oracle project into this new SQL Server-targeting project. Because you don't work with Oracle this time, you don't import the relational meta-data, just the entities, their relationships and possibly their inheritance hierarchies, if any. As they're now entities in your project, you can change them a bit to match the new customer's requirements. This can save you a lot of time, because you can re-use pre-fab model fragments for new projects. In the example above there are no tables yet (as you work model first), so using the forward mapping capabilities of LLBLGen Pro v3 creates the tables, PK constraints, Unique Constraints and FK constraints for you. This way you can build a nice repository of model fragments which you can re-use in new projects.

    Read the article

  • OpenCV install problems on Studio 12.04 - broken dependencies

    - by Will
    I'm trying to follow the Ubuntu OpenCV documentation at OpenCV. The provided script has a line which executed for some time, taking away more packages than I expected (such as ubuntu-studio video); sudo apt-get -qq remove ffmpeg x264 libx264-dev When the script gets to the line below, it bombs; sudo apt-get -qq install libopencv-dev build-essential checkinstall cmake pkg-config yasm libtiff4-dev libjpeg-dev libjasper-dev libavcodec-dev libavformat-dev libswscale-dev libdc1394-22-dev libxine-dev libgstreamer0.10-dev libgstreamer-plugins-base0.10-dev libv4l-dev python-dev python-numpy libtbb-dev libqt4-dev libgtk2.0-dev libfaac-dev libmp3lame-dev libopencore-amrnb-dev libopencore-amrwb-dev libtheora-dev libvorbis-dev libxvidcore-dev x264 v4l-utils ffmpeg The error msg is; E: Unable to correct problems, you have held broken packages. I've since run Update-Manager, run sudo apt-get update, rebooted, tried the above script line manually, and still no change. I've just run sudo apt-get install -f and nothing seemed to change. It did mention that some packages were no longer needed and could be removed by apt-get autoremove, so I ran that. It removed a number of packages, so I reran the install command above. Still the same problem of held broken packages. I just ran sudo apt-get -u dist-upgrade Part of the response was; The following packages have been kept back: gstreamer0.10-ffmpeg I'm not sure what that means. I do know that it shows up in my Update-Manager and cannot be checked. I then ran sudo dpkg --configure -a and then reran sudo apt-get -f install and the package was still not upgraded, though there was this very interesting comment; Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation: The following packages have unmet dependencies: gstreamer0.10-ffmpeg : Depends: libavcodec53 (< 5:0) but it is not going to be installed or libavcodec-extra-53 (< 5:0) but 5:0.7.2-1ubuntu1+codecs1~oneiric2 is to be installed E: Unable to correct problems, you have held broken packages. Then I ran sudo apt-get -u dist-upgrade It showed I had one held package, so I ran; sudo apt-get -o Debug::pkgProblemResolver=yes dist-upgrade It also exited without upgrading the package, so I ran; sudo apt-get remove --dry-run gstreamer0.10-ffmpeg:i386 And it gave me; *The following packages will be REMOVED: arista gstreamer0.10-ffmpeg 0 upgraded, 0 newly installed, 2 to remove and 0 not upgraded. Remv arista [0.9.7-3ubuntu1] Remv gstreamer0.10-ffmpeg [0.10.12-1ubuntu1]* But when I reran sudo apt-get -u dist-upgrade It showed the package was still there. *The following packages have been kept back: gstreamer0.10-ffmpeg 0 upgraded, 0 newly installed, 0 to remove and 1 not upgraded.* Update: Just went into Synaptic PM and completely removed gstreamer0.10-ffmpeg Reran sudo apt-get -u dist-upgrade And was told; 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. However, when I ran the original apt-get to install opencv (first code at the top of this question), it still gave me the same broken package errors. 
So I tried $ cat /etc/apt/sources.list # # deb cdrom:[Ubuntu-Studio 11.10 _Oneiric Ocelot_ - Release i386 (20111011.1)]/ oneiric main multiverse restricted universe # deb cdrom:[Ubuntu-Studio 11.10 _Oneiric Ocelot_ - Release i386 (20111011.1)]/ oneiric main multiverse restricted universe # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to # newer versions of the distribution. deb http://us.archive.ubuntu.com/ubuntu/ precise main restricted deb-src http://us.archive.ubuntu.com/ubuntu/ precise main restricted ## Major bug fix updates produced after the final release of the ## distribution. deb http://us.archive.ubuntu.com/ubuntu/ precise-updates main restricted deb-src http://us.archive.ubuntu.com/ubuntu/ precise-updates main restricted ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team. Also, please note that software in universe WILL NOT receive any ## review or updates from the Ubuntu security team. deb http://us.archive.ubuntu.com/ubuntu/ precise universe deb-src http://us.archive.ubuntu.com/ubuntu/ precise universe deb http://us.archive.ubuntu.com/ubuntu/ precise-updates universe deb-src http://us.archive.ubuntu.com/ubuntu/ precise-updates universe ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team, and may not be under a free licence. Please satisfy yourself as to ## your rights to use the software. Also, please note that software in ## multiverse WILL NOT receive any review or updates from the Ubuntu ## security team. deb http://us.archive.ubuntu.com/ubuntu/ precise multiverse deb-src http://us.archive.ubuntu.com/ubuntu/ precise multiverse deb http://us.archive.ubuntu.com/ubuntu/ precise-updates multiverse deb-src http://us.archive.ubuntu.com/ubuntu/ precise-updates multiverse ## N.B. software from this repository may not have been tested as ## extensively as that contained in the main release, although it includes ## newer versions of some applications which may provide useful features. ## Also, please note that software in backports WILL NOT receive any review ## or updates from the Ubuntu security team. deb http://security.ubuntu.com/ubuntu precise-security main restricted deb-src http://security.ubuntu.com/ubuntu precise-security main restricted deb http://security.ubuntu.com/ubuntu precise-security universe deb-src http://security.ubuntu.com/ubuntu precise-security universe deb http://security.ubuntu.com/ubuntu precise-security multiverse deb-src http://security.ubuntu.com/ubuntu precise-security multiverse ## Uncomment the following two lines to add software from Canonical's ## 'partner' repository. ## This software is not part of Ubuntu, but is offered by Canonical and the ## respective vendors as a service to Ubuntu users. deb http://archive.canonical.com/ubuntu precise partner # deb-src http://archive.canonical.com/ubuntu oneiric partner ## Uncomment the following two lines to add software from Ubuntu's ## 'extras' repository. ## This software is not part of Ubuntu, but is offered by third-party ## developers who want to ship their latest software. # deb http://extras.ubuntu.com/ubuntu oneiric main # deb-src http://extras.ubuntu.com/ubuntu oneiric main # deb http://download.opensuse.org/repositories/home:/popinet/xUbuntu_11.04 ./ # disabled on upgrade to precise and then; $ cat /etc/apt/sources.list.d/* But I don't have enough reputation to post the results here (it says I need at least 10 reputation points to post more than 2 links), so I don't know how to provide the requested feedback. 
Then tried; $ sudo apt-get check [sudo] password for <abcd>: Reading package lists... Done Building dependency tree Reading state information... Done However, no resolution of the problem yet. What else do I need to do? Will an upgrade to Ubuntu Studio 13.xx solve this problem (or compound it)?

    Read the article

  • September Independent Oracle User Group (IOUG) Regional Events:

    - by Mandy Ho
    September 5, 2012 – Denver, CO Oracle 11g Database Upgrade Seminar Join Roy Swonger, Senior Director of software development at Oracle, to learn about upgrading to Oracle Database 11g. Topics include: All the required preparatory steps Database upgrade strategies Post-upgrade performance analysis Helpful tips and common pitfalls to watch out for http://www.oracle.com/webapps/events/ns/EventsDetail.jsp?p_eventId=152242&src=7598177&src=7598177&Act=4 September 6, 2012 – Salt Lake City, UT Fall Symposium 2012 Plan to join us for our annual fall event on Sept 6. The day will be filled with learning and networking, with tracks focused on Applications, APEX, BI, Development and DBA topics. This event is free for UTOUG members to attend, but please register. http://www.utoug.org/apex/f?p=972:2:6686308836668467::::P2_EVENT_ID:121 September 6, 2012 – Portland, OR Oracle's Hands on Workshop Series focused on providing Defense-in-Depth Solutions to secure data at the source, reduce risk and simplify compliance The Oracle Database Security Workshop is a one-day hands-on session for IT Managers, IT Security Architects and Oracle DBAs who are looking for solutions to address their information protection, privacy, and accountability challenges within their Oracle database environment. Most security programs offered today fail to adequately address database security. Customers continue to be challenged to secure information against loss and protect the integrity of sensitive information like critical financial data, personally identifiable information (PII) and credit card data for PCI compliance. http://nwoug.org/content.aspx?page_id=87&club_id=165905&item_id=241082 September 11, 2012 – Montreal, QC APEXposed! For APEX aficionados – join ODTUG in Montreal, September 11-12 for APEXposed! Topics will include Dynamic Actions, Plug-ins, Tuning, and Building Mobile Apps. The cost is $399 US and early registration ends August 15th. For more information: http://www.odtugapextraining.com September 11, 2012 – Philadelphia, PA Big Data & What are we still doing wrong with Tom Kyte Tom Kyte is a Senior Technical Architect in Oracle's Server Technology Division. Tom is the Tom behind the AskTom column in Oracle Magazine and is also the author of Expert Oracle Database Architecture (Apress, 2005/2009), among other books. Abstract: Big Data The term "big data" draws a lot of attention, but behind the hype there's a simple story. For decades, companies have been making business decisions based on transactional data stored in relational databases. However, beyond that critical data is a potential treasure trove of less structured data: weblogs, social media, email, sensors, and photographs that can be mined for useful information. This presentation will take a look at what Big Data is and means - and Oracle's strategy for handling it. Abstract: What are we still doing wrong? I've given many best practices presentations in the last 10 years. I've given many worst practices presentations in the last 10 years. 
I've seen some things change over the last ten years and many other things stay exactly the same. In this talk - we'll be taking a look at the good and the bad - what we do right and what we continue to do wrong over and over again. We'll look at why "Why" is probably the right initial answer to most any question. We'll look at how we get to "Know what we Know", and why that can be both a help and a hindrance. We'll peek at "Best Practices" and tie them into what I term "Worst Practices". In short, a talk on the good and the bad. http://ioug.itconvergence.com/pls/apex/f?p=207:27:3669516430980563::NO September 12, 2012 – New York, NY NYOUG Fall General Meeting “Trends in Database Administration and Why the Future of Database Administration is the Vdba” http://www.nyoug.org/upcoming_events.htm#General_Meeting1 September 21, 2012 – Cleveland, OH Oracle Database 11g for Developers: What You need to know or Oracle Database 11g New Features for Developers Attendees are introduced to the new and improved features of Oracle 11g (both Oracle 11g R1 and Oracle 11g R2) that directly impact application development. Special emphasis is placed on features that reduce development time, make development simpler, improve performance, or speed deployment. Specific topics include: New SQL functions, virtual columns, result caching, XML improvements, pivot statements, JDBC improvements, and PL/SQL enhancements such as compound triggers. http://www.neooug.org/ September 24, 2012 – Ottawa, ON Introduction to Oracle Spatial The free Oracle Locator functionality, and the Oracle Spatial option which dramatically extends Locator, are very useful, but poorly understood capabilities of the database. In the afternoon we will extend into additional areas selected from: storage and performance; answering business problems with spatial queries; using Oracle Maps in OBIEE; an overview and capabilities of Oracle Topology; under the covers with GeoCoding. http://www.oug-ottawa.org/pls/htmldb/f?p=327:27:4209274028390246::NO

    Read the article

  • Coming to OpenWorld? A must attend session…

    - by Ruma Sanyal
    NTT Docomo, Inc. is the predominant mobile phone operator in Japan. The name is officially an abbreviation of the phrase, "do communications over the mobile network", and is also from a compound word dokomo, meaning "everywhere" in Japanese. One of the most important of NTT Docomo's systems is ALADIN, which is a nationwide operating system shared with its eight regional subsidiaries. ALADIN has five primary functions: customer management, phone number management, information processing and storage, sales information management, and credit investigation. To enhance cost efficiency and help ensure stable operation of ALADIN, NTT Docomo has employed Oracle WebLogic Server as a new application platform. Further information on this can be found here. Last year at OpenWorld, NTT Docomo was honored as an Innovation Award Winner for: · Implementing a real-time sales and contract management system, enabling all services requested by customers to be activated immediately, before the customer leaves the Docomo store · A robust disaster recovery strategy, room to grow the business, and the ability to move custom Java development to a platform with built-in standards - WebLogic · Better performance, better reliability, better stability, and smooth migration
Meet This Year's Most Impressive Innovators! This year we continue to honor customers for their most innovative and cutting-edge solutions using Oracle Fusion Middleware. Join us in celebrating award recipients’ great achievements and commitment to innovation. Oracle Fusion Middleware: Meet This Year's Most Impressive Innovators Session ID: CON7029 Tuesday, September 30, 2014 @ 5-5:45 pm (PST) Yerba Buena Center for the Arts, YBCA Theater (next to Moscone North) 700 Howard St., San Francisco, CA 94103

    Read the article

  • I Know What I Did This Summer: Put Down Trex Decking

    - by thatjeffsmith
    If you're wondering why I would bore everyone with my pictures and frequent status updates/tweets from the past week – it's so I could document the process of refurbishing my deck, or what some would call a porch. When we go to take a vacation, buy a car, do anything – we also read personal blogs to get the real story. So, if you're curious about what it takes to tackle this sort of project, read on. Skills/Equipment/Manpower We Possessed I took the old decking out by myself. I'm about 230 lbs, more than 6′ tall, and I'm pretty healthy. This took about 8 hours over two afternoons. Three of us put the deck back together. My wife has two engineering degrees. Her father also has two engineering degrees. Lots of brainpower available here. Also, her dad ran the public works department for a county for more than 20 years – so lots and lots of practical experience on hand. We had a compound mitre saw, a skilsaw, 2-3 crowbars, a framing hammer, 3 cordless drills, a corded drill, lots of sawhorses, a power sander, an angle grinder, a 10×10 Coleman canopy tent, a Ford F-150 pickup truck, outdoor speakers and lots of iTunes playlists, plenty of water and cold beer. Why We Did This Our deck was relatively young – it was built in 2005. However, the pressure-treated boards must not have been adequately maintained before we bought the house. I had powerwashed the deck every other year and had it stained a few times. The boards just rotted. We're going to be in the house for a long time, and we wanted something that would look nice and require little maintenance. More bad deck boards The deck boards were in bad shape Things We Learned The two most important things: The hidden fasteners have to be put in JUST right. Wedge them into the grooved board, then bend down the bit that is screwed down. We didn't do this on the first board and couldn't get the second board to fit nearly close enough. Watching the official TREX YouTube video helped immensely, and we should have watched that first. When pre-drilling holes for the boards that need to be screwed down – DO NOT pre-drill through the underlying framing wood. ONLY pre-drill through the TREX itself. Otherwise the screw won't seat in the board properly. Instead of sitting down flush with the board, it will stop at the top of the board and just spin. I had to call the place that sold me the screws to find this out. So about a third of our screws look like crap. If it doesn't look or feel right – stop everything and pick up your computer or your phone. It's not right, and it will be much easier to stop and find out why. We didn't do this, and now I'm going to see every screw that's not flush with the boards and get upset. Oh well. The Process How much time did it take? Well, I spent about 8 hours taking the deck apart. And then the three of us spent 8 hours the first day, 10 hours the second day, 8 hours the third, and another 6 hours on the fourth day. That's like 104 man-hours. We supposedly saved four or five thousand dollars in labor, but don't do the math here or you might get a bit upset. The main thing is that we got what we wanted, and there won't be any surprises later. Now for some pictures… This 6”+ pry bar made the destruction of the old deck much easier Most of the joists, once exposed, were OK. This joist wasn't sitting on ANYTHING before. We think a lazy gas person cut the board to sneak a gas line in. 
Awesome… These monster lag bolts had to be accounted for when putting in the additional framing The border pattern Sheri wanted to put in required a lot more framing. These were the first boards to go down – we screwed them in as there was no way to attach clips I sat, kicked in the boards, and then drilled these clips in – but my wife was able to go MUCH faster by using her hands to lock the boards in and drill on her knees. I liked locking the board in with my feet when they needed to be 'encouraged' to go straight. The first board took FOREVER to go in, but then when we got rolling, we were able to put in a 20′ board in less than 10 minutes. This was the end of construction day #2 – we got much further than we thought we would. Ah, the dreaded last 10% – what to do here? Remember those 'floating' stringers? Yeah, we fixed that up a bit, too. My wife used a website (and her brain) to calculate exactly how to cut the stringers to give us the rise/run we needed with the proper clearance and all that jazz. The stairs with stringers and toe kicks – this was worth the effort It started raining on us as I screwed down the steps – but we managed to get our shade tent up on the deck to protect us from the rain The stairs, finished Finished, mostly Good corner shot The top of the stairs Stairs, looking down Celebratory beer In Summary There are a few things we're not happy with. I think we can fix them up – but later. I have a few things left to finish: rewire the lighting, get the gas grille put back in, and rehang some screen doors. I was expecting this to be a lot worse than it was. If I didn't have the help, I would have never done it myself. But I'm glad that I did have that help and did do that project. It's not often you get to spend that kind of quality time with family while building cool stuff.
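
    For the curious, the stringer math mentioned above comes down to a few lines of arithmetic. Here is an illustrative Python sketch; the deck height, riser limit and tread depth are assumed numbers for the example, not measurements from this project:

        import math

        # Assumed dimensions for illustration only -- measure your own deck and check local code.
        total_rise_in = 38.0   # deck surface down to grade, in inches
        max_riser_in = 7.75    # a common code maximum for riser height
        tread_run_in = 10.5    # tread depth per step

        risers = math.ceil(total_rise_in / max_riser_in)  # how many steps are needed
        riser_height = total_rise_in / risers             # even height for every riser
        total_run = (risers - 1) * tread_run_in           # horizontal run of the stringer

        print(f"{risers} risers of {riser_height:.2f} in each, total run {total_run:.1f} in")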

    Read the article

  • Silverlight 4 Released

    - by ScottGu
    The final release of Silverlight 4 is now available. What is in the Silverlight 4 Release Silverlight 4 contains a ton of new features and capabilities.  In particular we focused on three scenarios with this release: Further enhancing media support Building great business applications Enabling out of the browser experiences On Tuesday I gave a 60 minute keynote about Silverlight 4 which showed off many of the new features and capabilities now available.  You can watch my keynote to learn more about Silverlight 4 and see a ton of great demos of it in action. Also check out these three great posts by Tim Heuer that talk about the new features and provide a guide to the new Silverlight 4 capabilities: Silverlight 4 Beta – A Guide to the New Features Silverlight 4 RC – What was updated Silverlight 4 Released Also read David Anson’s great Silverlight 4 Toolkit post to learn more about the new controls and functionality also available within the Silverlight Toolkit release we also made available today.  Also visit this page to learn more about the new Pivot functionality in Silverlight 4 – which makes it really easy to visualize and interact with collections of images using Silverlight. Lastly – make sure to visit the www.silverlight.net web-site and visit the “Get Started” section to find free tutorials that you can use. Download and Install Silverlight 4 Tools for VS 2010 To develop Silverlight 4 applications you should first download and install Visual Studio 2010 or download and install the free Visual Web Developer 2010 Express edition. Then install the Silverlight Tools RC2 for Visual Studio 2010.  This setup includes the Silverlight 4 Developer Runtime, Silverlight 4 SDK, RIA Services, and VS 2010 tools support.  Once installed you can do File->New Project and choose Silverlight Application to create your first Silverlight 4 project.  You can then use the new WYSIWYG Silverlight designer in Visual Studio 2010 to design and build rich Silverlight 4 applications. Important: If you previously installed the Silverlight 4 Beta or RC build on your machine, please make sure to go into Add/Remove programs and uninstall the “Update for Visual Studio 2010 (KB976272)” package prior to installing the Silverlight Tools RC2 for Visual Studio 2010 setup.  Note that while Silverlight 4 is released, the “Silverlight 4 Tools for VS 2010” is currently in “RC2” mode (meaning we are going to keep an eye out for any remaining issues before finally calling it done).  We’ll update the tools to be “final” in a few weeks once we verify that no last minute issues/bugs remain. Download and Install Expression Blend 4 Release Candidate You can also download and install the Expression Blend 4 RC to create and design great Silverlight 4 applications.  Blend contains “Sketchflow” support – which makes it really easy to rapidly prototype ideas and applications.  To learn more about Sketchflow watch this 90 second video of it in action. Summary Today’s release is the fourth release of Silverlight that we’ve shipped in the last 2.5 years.  The team has done a great job of advancing it quickly and staying focused.  We think today’s Silverlight 4 release opens up a ton of new opportunities to build great solutions for both consumers and business scenarios.  We are looking forward to seeing what you build with it! Hope this helps, Scott

    Read the article

  • Cocos2d-xna memory management for WP8

    - by Arkiliknam
    I recently upgraded to VS2012 and tried my in-dev game out on the new WP8 emulators, but was dismayed to find out the emulator now crashes and throws an out of memory exception during my sprite loading procedure (funnily, it still works in WP7 emulators and on my WP7). Regardless of whether the problem is the emulator or not, I want to get a clear understanding of how I should be managing memory in the game. My game consists of a character who has 4 or more different animations. Each animation consists of 4 to 7 frames. On top of that, the character has up to 8 stackable visualization modifications (e.g. eye type, nose type, hair type, clothes type). Before the memory issue, I preloaded all textures for each animation frame and customization and created animate actions out of them. The game then plays animations using the customizations applied to the current character. I re-looked at this implementation when I received the out of memory exceptions and have started playing with RenderTexture instead, so instead of preloading all possible textures, it only loads the textures needed for the character, renders them onto a single texture, from which the animation is built. This means the animations use 1/8th of the sprites they did before. I thought this would solve my issue, but it hasn't. Here's a snippet of my code: var characterTexture = CCRenderTexture.Create((int)width, (int)height); characterTexture.BeginWithClear(0, 0, 0, 0); // stamp a body onto my texture var bodySprite = MethodToCreateSpecificSprite(); bodySprite.Position = centerPoint; bodySprite.Visit(); bodySprite.Cleanup(); bodySprite = null; // stamp eyes, nose, mouth, clothes, etc... characterTexture.End(); As you can see, I'm calling Cleanup and setting the sprite to null in the hope of releasing the memory, though I don't believe this is the right way, nor does it seem to work... I also tried using SharedTextureCache to load textures before stamping my texture out, and then clearing the SharedTextureCache with: CCTextureCache.SharedTextureCache.RemoveAllTextures(); But this didn't have an effect either. Any tips on what I'm not doing? I used VS to do a memory profile of the emulation causing the crash. Both WP7.1 and WP8 emulators peak at about 150MB of usage. WP8 crashes and throws an out of memory exception. Each customisation/frame is 15KB at the most. Let's say there are 8 layers of customisation = 120KB, but I render them onto one texture which I would assume is only 15KB again. Each animation is 8 frames at the most. That's 15KB for 1 texture, or 960KB for 8 textures of customisation. There are 4 animation sets. That's 60KB for 4 sets of 1 texture, or 3.75MB for 4 sets of 8 textures of customisation. So even if it's storing every layer, it's 3.75MB... nowhere near the 150MB breaking point my profiler seems to suggest :( WP 7.1 Memory Profile (max 150MB) WP8 Memory Profile (max 150MB and crashes)

    Read the article

  • This Week in Geek History: Gmail Goes Public, Deep Blue Wins at Chess, and the Birth of Thomas Edison

    - by Jason Fitzpatrick
    Every week we bring you a snapshot of the week in Geek History. This week we're taking a peek at the public release of Gmail, the first time a computer won against a chess champion, and the birth of prolific inventor Thomas Edison. Gmail Goes Public It's hard to believe that Gmail has only been around for seven years and that for the first three years of its life it was invite only. In 2007 Gmail dropped the invite-only requirement (although they would hold onto the "beta" tag for another two years) and opened its doors for anyone to grab a username @gmail. For what seemed like an entire epoch in internet history, Gmail had the slickest web-based email around, with constant innovations and features rolling out from Gmail Labs. Only in the last year or so have major overhauls at competitors like Hotmail and Yahoo! Mail brought other services up to speed. Can't stand reading a Week in Geek History entry without a random fact? Here you go: gmail.com was originally owned by the Garfield franchise and ran a service that delivered Garfield comics to your email inbox. No, we're not kidding. Deep Blue Proves Itself a Chess Master Deep Blue was a supercomputer constructed by IBM with the sole purpose of winning chess matches. In 2011, with the all-seeing eye of Google and the amazing computational abilities of engines like Wolfram Alpha, we simply take powerful computers immersed in our daily lives for granted. The 1996 match against reigning world chess champion Garry Kasparov, wherein Deep Blue held its own but ultimately lost a 4-2 match, shook a lot of people up. What did it mean if something considered such an elegant and quintessentially human endeavor as chess was so easy for a machine? A series of upgrades helped Deep Blue outright win a match against Kasparov in 1997 (seen in the photo above). After the win Deep Blue was retired and disassembled. Parts of Deep Blue are housed in the National Museum of American History and the Computer History Museum. Birth of Thomas Edison Thomas Alva Edison was one of the most prolific inventors in history and holds an astounding 1,093 US patents. He is responsible for outright inventing or greatly refining major innovations in the history of world culture, including the phonograph, the movie camera, the carbon microphone used in nearly every telephone well into the 1980s, batteries for electric cars (a notion we'd take over a century to take seriously), voting machines, and of course his enormous contribution to electric distribution systems. Despite the role of scientist and inventor being largely unglamorous, Thomas Edison and his tumultuous relationship with fellow inventor Nikola Tesla have been fodder for everything from books, to comics, to movies, and video games. Other Notable Moments from This Week in Geek History Although we only shine the spotlight on three interesting facts a week in our Geek History column, that doesn't mean we don't have space to highlight a few more in passing. This week in Geek History: 1971 – Apollo 14 returns to Earth after the third Lunar landing mission. 1974 – Birth of Robot Chicken creator Seth Green. 1986 – Death of Dune creator Frank Herbert. Goodnight Dune. 1997 – The Simpsons becomes the longest running animated show on television. Have an interesting bit of geek trivia to share? Shoot us an email to [email protected] with "history" in the subject line and we'll be sure to add it to our list of trivia. 

    Read the article

  • What is Agile Modeling and why do I need it?

    What is Agile Modeling and why do I need it? Agile Modeling is an add-on to existing agile methodologies like Extreme Programming (XP) and the Rational Unified Process (RUP). Agile Modeling enables developers to develop a customized software development process that actually meets their current development needs and is flexible enough to adjust in the future. According to Scott Ambler, Agile Modeling consists of five core values that enable this methodology to be effective and lightweight. Agile Modeling Core Values: Communication Simplicity Feedback Courage Humility Communication is a key component of any successful project. Open communication between stakeholders and the development team is essential when developing new applications or maintaining legacy systems. Agile models promote communication amongst software development teams and stakeholders. Furthermore, Agile Models provide a common understanding of an application for members of a software development team, allowing them to have a universal common point of reference. The use of simplicity in Agile Models enables the exploration of new ideas and concepts through the use of basic diagrams instead of investing the time in writing tens or hundreds of lines of code. Feedback in regards to application development is essential. Feedback allows a development team to confirm that the development path is on track. Agile Models allow for quick feedback from stakeholders because minimal to no technical expertise is required to understand basic models. Courage is important because you need to make important decisions and be able to change direction by either discarding or refactoring your work when some of your decisions prove inadequate, according to Scott Ambler. As members of a development team, we must admit that we do not know everything, even though some of us think we do. This is where humility comes into play. Everyone is a knowledge expert in their own specific domain. If you need help with your finances then you would consult an accountant. If you have a problem or need help with a topic, why would you not consult a subject expert? An effective approach is to assume that everyone involved with your project has equal value and therefore should be treated with respect. Agile Model Characteristics: Purposeful Understandable Sufficiently Accurate Sufficiently Consistent Sufficiently Detailed Provide Positive Value Simple as Possible Just Fulfill Basic Requirements According to Scott Ambler, Agile Models are as effective as possible because the time invested in the model is just enough effort to complete the job. Furthermore, if a model isn't good enough yet, then additional effort can be invested to get more value out of the model. However, if a model is good enough for the current needs, or surpasses the current needs, then any additional work done on the model would be a waste. It is important to remember that good enough is in the eye of the beholder, so this can be tough. In order for Agile Models to work effectively, active stakeholder participation in the modeling process is needed. Finally, it is also very important to model with others; this allows for additional input, ensuring that all the stakeholders' needs are reflected in the models. How can Agile Models be incorporated into our projects? Agile Models can be incorporated into our projects during the requirements gathering and design phases. 
As requirements are gathered, the models should be updated to incorporate the new project details as they are defined and refined. Additionally, the Agile Models created during the requirements phase can be the basis for the models created during the design phase. It is important to only add to the model when the changes fit within the Agile Model characteristics and do not overcomplicate the design.

    Read the article

  • BI and EPM Landscape

    - by frank.buytendijk
    Most of my blog entries are not about Oracle products, and most of the latest entries are about topics such as IT strategy and enterprise architecture. However, given my background at Gartner, and at Hyperion, I still keep a close eye on what's happening in BI and EPM. One important reason is that I believe there is significant competitive value for organizations getting BI and EPM right. Davenport and Harris wrote a great book called "Competing on Analytics", in which they explain this in a very engaging and convincing way. At Oracle we have defined the concept of "management excellence" that outlines what organizations have to do to keep or create a competitive edge. It's not only in the business processes, but also in the management processes. Recently, Gartner published its 2009 market shares report for BI, Analytics, and Performance Management. Gartner identifies the same three segments that Oracle does: (1) CPM Suites (Oracle refers not to Corporate Performance Management, but Enterprise Performance Management), (2) BI Platform, and (3) Analytic Applications & Performance Management. According to Gartner, Oracle's share is increasing with revenue growing by more than 5%. Oracle currently holds the #2 market share position in the overall BI Software space based on total BI software revenue. Source: Gartner Dataquest Market Share: Business Intelligence, Analytics and Performance Management Software, Worldwide, 2009; Dan Sommer and Bhavish Sood; Apr 2010 Gartner has ranked Oracle as #1 in the CPM Suites worldwide sub-segment based on total BI software revenue, and Oracle is gaining share with revenue growing by more than 6% in 2009. Source: Gartner Dataquest Market Share: Business Intelligence, Analytics and Performance Management Software, Worldwide, 2009; Dan Sommer and Bhavish Sood; Apr 2010 The Analytic Applications & Performance Management subsegment is more fragmented. It has for instance a very large "Other Vendors" category. The largest player traditionally is SAS. Analytic Applications are often meant for very specific analytic needs in very specific industry sectors. According to Gartner, from the large vendors, again Oracle is the one who is gaining the most share - with total BI software revenue growth close to 15% in 2009. Source: Gartner Dataquest Market Share: Business Intelligence, Analytics and Performance Management Software, Worldwide, 2009; Dan Sommer and Bhavish Sood; Apr 2010 I believe this shows Oracle's integration strategy is working. In fact, integration actually is the innovation. BI and EPM have been silo technology platforms and application suites way too long. Management and measuring performance should be very closely linked to strategy execution, which is the domain of other business application areas such as CRM, ERP, and Supply Chain. BI and EPM are not about "making better decisions" anymore, but are part of a tangible action framework. Furthermore, organizations are getting more serious about ecosystem thinking. They do not evaluate single tools anymore for different application areas, but buy into a complete ecosystem of hardware, software and services. The best ecosystem is the one that offers the most options, in environments where the uncertainty is high and investments are hard to reverse. The key to successfully managing such an environment is middleware, and BI and EPM become increasingly middleware intensive. 
In fact, given the horizontal nature of BI and EPM, sitting on top of all business functions and applications, you could call them "upperware". Many are active in the BI and EPM space. Big players can offer a lot, but there are always many areas that are covered by specialty vendors. Oracle openly embraces those technologies within the ecosystem as well. Complete, open and integrated still accurately describes the Oracle product strategy. frank

    Read the article

  • SharePoint OCR image files indexing

    Introduction This article describes how to set up indexing of image files (including TIFF, PDF, JPEG, BMP...) using OCR technology. The indexing described below utilizes Microsoft IFilter technology and as such is not specific to SharePoint, but can be used with any product that uses Microsoft indexing: Microsoft Search, Desktop Search, SQL Server search, and, through plug-ins, Google Desktop Search. I however use it with Microsoft Windows SharePoint Services 2003. For those other products, the registration may need to be slightly different. Background One of the projects I was working on required storage of old documents scanned into PDF files. Then there was a separate team of people responsible for providing tags for a search engine so those image documents could be found. The whole process was clumsy, labor intensive, and error prone. That was what started me on my exploration path. OCR The first search I fired was for Open Source OCR products. Pretty quickly, I narrowed it down to TESSERACT (http://code.google.com/p/tesseract-ocr/). Tesseract is an orphaned brainchild of HP, which worked on it from 1985 to 1995. Then it was moved to Open Source, and now, if I understand it correctly, Google is working on it. With credentials like that, it's no wonder that Tesseract scores one of the highest marks on OCR recognition and accuracy. After downloading and struggling just a bit, I got Tesseract to work. The struggling part was that the home page claims that its base input format is a TIFF file. Maybe my TIFFs were bad, but I was able to get it to work only for BMP files. Image file conversion So now that I have an OCR that can convert BMP files into text, how do I get text out of the image PDF files? One more search, and I settled on ImageMagick (http://www.imagemagick.org/). This is another wonderful Open Source utility that can convert any file into an image. It did work out of the box, converting any TIFF files into bitmaps, but to get PDF files converted, it requires GhostScript (http://mirror.cs.wisc.edu/pub/mirrors/ghost/GPL/gs864/gs864w32.exe). Dealing with text PDFs With that utility installed, I was cooking - I can convert any file (in particular PDF and TIFF) into a bitmap, and then I can extract the text out of the bitmap. The only consideration was to somehow treat PDF files containing text differently - after all, OCR is very computation intensive and somewhat error prone even with perfect image quality and resolution. So another quick search, and I have PDFTOTEXT (ftp://ftp.foolabs.com/pub/xpdf/xpdf-3.02pl4-win32.zip) - thank God for Open Source! With these guys, I can pull text out of a PDF in an eye blink. However, I would get nothing for pure image PDFs, but I already have a solution for that! Batch process It took another 15 minutes to set up a batch script to automate the process: Check the file extension. If the file is a PDF, try to extract text out of it; if there is more than a certain amount of text in the file - done! If there is no text, convert the first page into a bitmap and run OCR on the bitmap. For any other file type, convert the file into a bitmap and run OCR on the bitmap. Once you unzip the attached project, check out the bin\OCR.BAT file. It will create a temporary file in the directory where your source file is, with the same name + the '.txt' extension.
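
    For readers who would rather see that batch logic as code, here is a minimal sketch of the same flow in Python. The tool names (pdftotext, ImageMagick's convert, tesseract) come from the article, but the exact command-line options and the text-size threshold are assumptions, so adjust them to the versions you have installed:

        # Sketch of the OCR pipeline described above: try embedded PDF text first,
        # then fall back to rasterizing the first page and running OCR on it.
        import os
        import subprocess
        import sys

        MIN_TEXT_BYTES = 200  # rough stand-in for "more than a certain amount of text"

        def extract_text(path):
            base, ext = os.path.splitext(path)
            txt = base + ".txt"
            if ext.lower() == ".pdf":
                # Fast path: pull any embedded text straight out of the PDF.
                subprocess.run(["pdftotext", path, txt], check=True)
                if os.path.getsize(txt) >= MIN_TEXT_BYTES:
                    return txt
            # Image-only PDF or any other file type: rasterize page one, then OCR it.
            subprocess.run(["convert", path + "[0]", base + ".bmp"], check=True)
            subprocess.run(["tesseract", base + ".bmp", base], check=True)  # writes <base>.txt
            return txt

        if __name__ == "__main__":
            print(extract_text(sys.argv[1]))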

    Read the article

  • SQLAuthority News – Android Efficiency Tips and Tricks – Personal Technology Tip #003

    - by pinaldave
    I use my phone for lots of things.  I use it mainly to replace my tablet – I can e-mail, take and edit photos, and do almost everything I can do on a laptop with this phone.  And I am sure that there are many of you out there just like me.  I personally have a Galaxy S3, which uses the Android operating system, and I have decided to feature it as the third installment of my Technology Tips and Tricks series. 1) Shortcut to your favorite contacts on home screen Access your most-called contacts easily from your home screen by holding your finger on any empty spot on the home screen.  A menu will pop up that allows you to choose Shortcuts, and Contact.  You can scroll through your contact list and then just tap on the name of the person you want to be able to dial with a single click. 2) Keep track of your data usage Yes, we all should keep a close eye on our data usage, because it is very easy to go over our limits and then end up with a giant bill at the end of the month.  Never get surprised when you open that mobile phone envelope again.  Go to Settings, then Data Usage, and you can find a quick rundown of your usage, how much data each app uses, and you can even set alarms to let you know when you are nearing the limits.   Better yet, you can set the phone to stop using data when it reaches a certain limit. 3) Bring back Good Grammar We often hear proclamations about the downfall of written language, and how texting abbreviations, misspellings, and lack of punctuation are the root of all evil.  Well, we can show all those doomsdayers that all is not lost by bringing punctuation back to texting.  Usually we leave it off when we text because it takes too long to get to the screen with all the punctuation options.  But now you can hold down the period (or “full stop”) button and a list of all the commonly-used punctuation marks will pop right up. 4) Apps, Apps, Apps and Apps And finally, I cannot end an article about smart phones without including a list of my favorite apps.  Here is a list of my Top 10 Applications on my Android (not counting social media apps). Advanced Task Killer – Keeps my phone snappy by closing unnecessary apps WhatsApp - my favorite alternative to Text SMS Flipboard - my ‘timepass’ moments Skype – keeps me close to friends and family GoogleMaps - I am never lost because of this one thing Amazon Kindle – Books my best friends DropBox - My data always safe Pluralsight Player – Learning never stops for me Samsung Kies Air – Connecting Phone to Computer Chrome – Replacing default browser I have not included any social media applications in the above list, but you can be sure that I am linked to Twitter, Facebook, Google+, LinkedIn, and YouTube. Reference: Pinal Dave (http://blog.SQLAuthority.com)   Filed under: Best Practices, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology Tagged: Android, Personal Technology

    Read the article

  • Observations From The Corner of a Starbucks

    - by Chris Williams
    I’ve spent the last 3 days sitting in a Starbucks for 4-8 hours at a time. As a result, I’ve observed a lot of interesting behavior and people (most of whom were uninteresting themselves.) One of the things I’ve noticed is that most people don’t sit down. They come in, get their drink and go. The ones that do sit down, stay much longer than it takes to consume their drink. The drink is just an incidental purchase. Certainly not the reason they are here. Most of the people who sit also have laptops. Probably around 75%. Only a few have kids (with them) but the ones that do, have very small kids. Toddlers or younger. Of all the “campers” only a small percentage are wearing headphones, presumably because A) external noise doesn’t bother them or B) they aren’t working on anything important. My buddy George falls into category A, but he grew up in a house full of people. Silence freaks him out far more than noise. My brother and I, on the other hand, were both only children and don’t handle noisy distractions well. He needs it quiet (like a tomb) and I need music. Go figure… I can listen to Britney Spears mixed with Apoptygma Berzerk and Anthrax and crank out 30 pages, but if your toddler is banging his spoon on the table, you’re getting a dirty look… unless I have music, then all is right with the world. Anyway, enough about me. Most of the people who come in as a group are smiling when they enter. Half as many are smiling when they leave. People who come in alone typically aren’t smiling at all. The average age, over the last three days, seems to be early 30s… with a couple of senior citizens and teenagers at either end of the curve. The teenagers almost never stay. They have better stuff to do on a nice day. The senior citizens are split nearly evenly between campers and in&outs. Most of the non-solo campers have 1 person with a laptop, while the other reads the paper or a book. Some campers bring multiple laptops… but only really look at one of them. This Starbucks has a drive-through. The line is almost never more than 2-3 cars long but apparently a lot of the in&out people would rather come in and stand in line behind (up to) 5 people. The music in here sucks. My musical tastes can best be described as eclectic to bad, but I can still get work done (see above.) I find the music in this particular Starbucks to be discordant and jarring. At this Starbucks, the coffee lingo is apparently something that is meant to occur between employees only. The nice lady at the counter can handle orders in plain English and translate them to Baristaspeak (Baristese?) quite efficiently. If you order in Baristaspeak however, she will look confused and repeat your order back to you in plain English to confirm you actually meant what you said. Then she will say it in Baristaspeak to the lady making your drink. Nobody in this Starbucks (other than the Baristas) makes eye contact… at least not with me. Of course that may be indicative of a separate issue. ;)

    Read the article

  • The Evolution Of C#

    - by Paulo Morgado
    The first release of C# (C# 1.0) was all about building a new language for managed code that appealed, mostly, to C++ and Java programmers. The second release (C# 2.0) was mostly about adding what there wasn’t time to build into the 1.0 release. The main feature for this release was Generics. The third release (C# 3.0) was all about reducing the impedance mismatch between general-purpose programming languages and databases. To achieve this goal, several functional programming features were added to the language and LINQ was born. Going forward, new trends are showing up in the industry and modern programming languages need to be more: Declarative With imperative languages, although the eye is on the what, programs need to focus on the how. This leads to over-specification of the solution to the problem at hand, making it next to impossible for the execution engine to be smart about the execution of the program and optimize it to run more efficiently (given the hardware available, for example). Declarative languages, on the other hand, focus only on the what and leave the how to the execution engine. LINQ made C# more declarative by using higher-level constructs like orderby and group by that give the execution engine a much better chance of optimizing the execution (by parallelizing it, for example). Concurrent Concurrency is hard, needs to be thought about, and is very hard to shoehorn into a programming language. Parallel.For (from the parallel extensions) looks like a parallel for because enough expressiveness has been built into C# 3.0 to allow this without having to commit to specific language syntax. Dynamic There has been lots of debate on which are the better programming languages: static or dynamic. The fact is that both have good qualities and users of both types of languages want to have it all. All these trends require a paradigm switch. C# is, in many ways, already a multi-paradigm language. It’s still very object oriented (class oriented as some might say) but it can be argued that C# 3.0 has become a functional programming language because it has all the cornerstones of what a functional programming language needs. Moving forward, it will have even more. Besides the influence of these trends, there was a decision to co-evolve the C# and Visual Basic programming languages. Since their inception, there has been some effort to position C# and Visual Basic against each other and to try to explain what should be done with each language or what kind of programmers use one or the other. Each language should be chosen based on the past experience and familiarity of the developer/team/project/company and not on particular features. In the past, every time a feature was added to one language, the users of the other wanted that feature too. Going forward, when a feature is added to one language, the other will work hard to add the same feature. This doesn’t mean that XML literals will be added to C# (because almost the same can be achieved with LINQ To XML), but Visual Basic will have auto-implemented properties. Most of these features require or are built on top of features of the .NET Framework, and the focus for C# 4.0 was on dynamic programming: not just dynamic types, but being able to talk with anything that isn’t a .NET class. Also introduced in C# 4.0 are co-variance and contra-variance for generic interfaces and delegates. Stay tuned for more on the new C# 4.0 features.
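
    The three trends are easier to see side by side in code. Below is a small, self-contained C# sketch (the data and names are invented for illustration): the LINQ query states what result is wanted and leaves the how to the engine, Parallel.For expresses concurrency without any new syntax, and dynamic defers member lookup to run time, which is the capability C# 4.0 builds on to talk to things that aren't .NET classes.

    using System;
    using System.Linq;
    using System.Threading.Tasks;

    class Trends
    {
        static void Main()
        {
            var orders = new[]
            {
                new { Customer = "Ana", Total = 120m },
                new { Customer = "Bob", Total = 80m },
                new { Customer = "Ana", Total = 45m }
            };

            // Declarative: state the what (grouped, ordered totals); the engine owns the how.
            var byCustomer = from o in orders
                             group o by o.Customer into g
                             orderby g.Key
                             select new { Customer = g.Key, Total = g.Sum(o => o.Total) };

            foreach (var row in byCustomer)
                Console.WriteLine("{0}: {1}", row.Customer, row.Total);

            // Concurrent: reads like an ordinary for loop, but iterations may run on several cores.
            Parallel.For(0, 10, i => Console.WriteLine("iteration {0}", i));

            // Dynamic: member resolution happens at run time rather than compile time.
            dynamic anything = "hello";
            Console.WriteLine(anything.Length); // resolved against System.String when it runs
        }
    }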

    Read the article

  • What’s the use of code reuse?

    - by Tony Davis
    All great developers write reusable code, don’t they? Well, maybe, but as with all statements regarding what “great” developers do or don’t do, it’s probably an over-simplification. A novice programmer, in particular, will encounter in the literature a general assumption of the importance of code reusability. They spend time worrying about DRY (don’t repeat yourself), moving logic into specific “helper” modules that they can then reuse, agonizing about the minutiae of the class structure, inheritance and interface design that will promote easy reuse. Unfortunately, writing code specifically for reuse often leads to complicated object hierarchies and inheritance models that are anything but reusable. If, instead, one strives to write simple code units that are highly maintainable and perform a single function, in a concise, isolated fashion then the potential for reuse simply “drops out” as a natural by-product. Programmers, of course, care about these principles, about encapsulation and clean interfaces that don’t expose inner workings and allow easy pluggability. This is great when it helps with the maintenance and development of code but how often, in practice, do we actually reuse our code? Most DBAs and database developers are familiar with the practical reasons for the limited opportunities to reuse database code and its potential downsides. However, surely elsewhere in our code base, reuse happens often. After all, we can all name examples, such as date/time handling modules, which if we write with enough care we can plug in to many places. I spoke to a developer just yesterday who looked me in the eye and told me that in 30+ years as a developer (a successful one, I’d add), he’d never once reused his own code. As I sat blinking in disbelief, he explained that, of course, he always thought he would reuse it. He’d often agonized over its design, certain that he was creating code of great significance that he and other generations would reuse, with grateful tears misting their eyes. In fact, it never happened. He had in his head, most of the algorithms he needed and would simply write the code from scratch each time, refining the algorithms and tailoring the code to meet the specific requirements. It was, he said, simply quicker to do that than dig out the old code, check it, correct the mistakes, and adapt it. Is this a common experience, or just a strange anomaly? Viewed in a certain light, building code with a focus on reusability seems to hark to a past age where people built cars and music systems with the idea that someone else could and would replace and reuse the parts. Technology advances so rapidly that the next time you need the “same” code, it’s likely a new technique, or a whole new language, has emerged in the meantime, better equipped to tackle the task. Maybe we should be less fearful of the idea that we could write code well suited to the system requirements, but with little regard for reuse potential, and then rewrite a better version from scratch the next time.

    Read the article

  • SQLAuthority News – Top 5 Latest Microsoft Certifications of 2013 – Guest Post

    - by Pinal Dave
    With the IT job market getting more and more competitive by the day, certifications are a must for anyone who wishes to get a strong foothold in the industry. The Microsoft community comes up with regular updates and enhancements to its existing products to keep up with the rapidly evolving requirements of the ICT industry. We bring you a list of the five latest Microsoft certifications that you must consider acquiring this year. MCSE: SharePoint Learn all about Windows Server 2012 and Microsoft SharePoint 2013, which brings an advanced set of features to the fore in this latest version. It introduces new capabilities for business intelligence, social media, branding, search, identity management, and mobile devices, among other features. Enjoy a great user experience with sharing and collaboration in community forums, within a pixel-perfect SharePoint website. Data connectivity and business intelligence tools allow users to process and access data, analyze reports, and share and collaborate with each other more conveniently. Microsoft Specialist: Microsoft Project 2013 The only project management system that works seamlessly with other applications and cloud solutions from Microsoft, MS Project 2013 offers more than meets the eye.  It provides for easier management and monitoring of projects so that users can ensure timely delivery while improving productivity significantly. So keep all your projects on track and collaborate with your team like never before with this enhanced release! This one’s a must for all project managers. MCSE Messaging Another of Microsoft’s gems is its messaging environment, which has also seen its latest release, Microsoft Exchange Server 2013. Messaging administrators can take up this training and validate their expertise in Unified Messaging, Exchange Online, PowerShell and Virtualization strategies, through the MCSE Messaging certification in Exchange Server. If you wish to enhance the productivity and data security of your organization while being flexible and extremely efficient, this is the right certification for you. MCSE Communication An enterprise can function optimally on the strength of its information flow and communication systems. With Lync Server 2013, you can introduce a whole new world of unified communications, consisting of audio/video conferencing, dial-in, Persistent Chat, instant chat, and EDGE services, into your organization. Utilize IT to serve and support business objectives by mastering this UC technology with this latest MCSE Communication course on using Microsoft Lync Server 2013. MCSE: SQL Server 2012 BI Platform The decision-making process is largely influenced by the underlying enterprise information used by management for business intelligence. Therefore, a robust business intelligence platform that anchors enterprise IT and transforms it into operational efficiencies is the need of the hour. The SQL Server 2012 BI Platform certification helps professionals implement, manage and maintain a BI database infrastructure effectively. IT professionals with BI skills are highly sought after these days. MCSD: Windows Store Apps A Microsoft Certified Solutions Developer certification in Windows Store Apps validates your potential in designing interactive apps. Learn The Essentials of Developing Windows Store Apps using HTML5 and JavaScript and establish yourself as an ace developer capable of creating fast and fluid Metro style apps for Windows 8 that are accessible on a variety of devices. 
You can also go ahead and learn the Essentials of Developing Windows Store Apps using C# if you’re already familiar with, and working in, the C# programming language. Developers are thus free to choose their own favorite development stream, which opens the door to the latest and most exciting application development platform: Windows Store apps. Software developers with these skills are in great demand in the industry today. In order to remain competitive in their respective fields, it is imperative that IT personnel update their knowledge on a regular basis. Certifications are a means to achieve this goal. No longer considered an optional prerequisite, major IT certifications such as these are now essential to stay afloat in a cut-throat industry where technologies change on a daily basis. This blog is written by Aruneet Anand of Koenig Solutions. Koenig Solutions does training for all of the above courses. For more information, visit the website. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology Tagged: Microsoft Certifications

    Read the article

  • F# WPF Form – the basics

    - by MarkPearl
    I was listening to Dot Net Rocks show #560 about F# and during the podcast Richard Campbell brought up a good point with regard to F# and a GUI. In essence what I understood his point to be was that until one could write an end-to-end application in F#, it would be a hard sell to developers to take it on. In part I agree with him; while I am beginning to really enjoy learning F#, I can’t help but feel that I would be a lot further into the language if I could do my Windows Forms like I do in C# or VB.NET, for the simple reason that in “playing” applications I spend the majority of the time in the UI layer… So I have been keeping my eye out for some examples of creating a WPF form in an F# project and came across Tim’s F# Twitter Stream Sample, which had exactly this… Of course he actually had a bit more than a basic form, but it was enough for me to scrap the insides and glean what I needed. So today I am going to make just the very basic WPF form with all the goodness of a XAML window. Getting Started The first thing we need to do is create a new solution with a blank F# application project – I have called mine FSharpWPF. Once you have the project created you will need to change the project type from a Console Application to a Windows Application. You do this by right-clicking on the project file and going to its properties… Once that is done you will need to add the appropriate references. You do this by right-clicking on References in the Solution Explorer and clicking “Add Reference”. You should add the appropriate .NET references below for WPF & XAML to work. Once these references are added you then need to add your XAML file to the project. You can do this by adding a new item to the project of type xml and simply changing the file extension from xml to xaml. Once the xaml file has been added to the project you will need to add valid window XAML. An example of a very basic xaml file is shown below… <Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" Title="F# WPF Form" Height="350" Width="525"> <Grid> </Grid> </Window> Once your xaml file is done… you need to set the build action of the xaml file from “None” to “Resource” as depicted in the picture below. If you do not set this you will get an IOException error when running the completed project, with a message along the lines of “Cannot locate resource ‘window.xaml’”. You then need to tie everything up by putting the correct F# code in Program.fs to load the xaml window. In Program.fs put the following code… module Program open System open System.Collections.ObjectModel open System.IO open System.Windows open System.Windows.Controls open System.Windows.Markup [<STAThread>] [<EntryPoint>] let main(_) = let w = Application.LoadComponent(new System.Uri("/FSharpWPF;component/Window.xaml", System.UriKind.Relative)) :?> Window (new Application()).Run(w) Once all this is done you should be able to build and run your project. What you have done is created a WPF-based window inside an F# project. It should look something like below… Nothing too exciting, but sufficient to illustrate the very basic WPF form in F#. Hopefully in future posts I will build on this to expose button events etc.

    Read the article

  • Monitoring the Application alongside SQL Server

    - by Tony Davis
    Sometimes, on Simple-Talk, it takes a while to spot strange and unexpected patterns of user activity, or small bugs. For example, one morning we spotted that an article’s comment count had leapt to 1485, but that only four were displayed. With some rooting around in Google Analytics, and the endlessly annoying Community Server admin-interface, we were able to work out that a few days previously the article had been subject to a spam attack and that the comment count was for some reason including both accepted and unaccepted comments (which in turn uncovered a bug in the SQL). This sort of incident made us a lot keener on monitoring Simple-talk website usage more effectively. However, the metrics we wanted are troublesome, because they are far too specific for Google Analytics to measure, and the SQL Server backend doesn’t keep sufficient information to enable us to plot trends. The latter could provide, for example, the total number of comments made on, or votes cast for, articles, over all time, but not the number that occur by hour over a set time. We lacked a baseline, in other words. We couldn’t alter the database, as it is a bought-in package. We had neither the resources nor the inclination to build in dedicated application monitoring. Possibly, we could investigate a third-party tool to do the job; but then it occurred to us that we were already using a monitoring tool (SQL Monitor) to keep an eye on the database. It stored data, made graphs and sent alerts. Could we get it to monitor some aspects of the application as well? Of course, SQL Monitor’s single purpose is to check and monitor SQL Server, over time, rather than to monitor applications that use SQL Server. However, how different is the business of gathering and plotting SQL Server Wait Stats, from gathering and plotting various aspects of user activity on the site? Not a lot, it turns out. The latest version allows us to write our own custom monitoring scripts, meaning that we could now monitor any metric in the application that returns an integer. It took little time to write a simple SQL Query that collects basic metrics of the total number of subscribers, votes cast, comments made, or views of articles, over time. The SQL Monitor database polls Simple-Talk every second or so in order to get the latest totals, and can then store and plot this information, or even correlate SQL Server usage to application usage. You can see the live data by visiting monitor.red-gate.com. Click the "Analysis" tab, and select one of the "Simple-talk:" entries in the "Show" box and an appropriate data range (e.g. last 30 days). It’s nascent, and we’re still working on it, but it’s already given us more confidence that we’ll quickly spot trends, bugs, or bursts of ‘abnormal’ activity. If there is a sudden rise in comments, we get an alert, and if it’s due to a spam attack, we can moderate or ban the perpetrator very quickly. We’ve often argued that a tool should perform a single job well rather than turn into a Swiss-army knife, but ironically we’ve rather appreciated being able to make best use of what’s there anyway for a slightly different purpose. Is this a good or common practice? What do you think? Cheers, Tony.
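
    For anyone curious what such a custom metric boils down to, here is a minimal C# sketch of the idea: a query that returns a single integer, sampled on a schedule so the values can be stored and plotted as a trend. The connection string, table name, and polling interval are invented for illustration, and this is not how SQL Monitor works internally; it is just the shape of the technique.

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    class CommentCountSampler
    {
        static void Main()
        {
            // Hypothetical names - substitute your own database and metric query.
            const string connectionString = "Server=.;Database=SimpleTalk;Integrated Security=true";
            const string metricQuery = "SELECT COUNT(*) FROM dbo.Comments WHERE IsApproved = 1";

            while (true)
            {
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand(metricQuery, conn))
                {
                    conn.Open();
                    int total = (int)cmd.ExecuteScalar();

                    // A monitoring tool would persist this sample; printing it keeps the sketch simple.
                    Console.WriteLine("{0:u}  approved comments = {1}", DateTime.UtcNow, total);
                }

                Thread.Sleep(TimeSpan.FromMinutes(1)); // sample once a minute
            }
        }
    }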

    Read the article

  • Exception Handling Frequency/Log Detail

    - by Cyborgx37
    I am working on a fairly complex .NET application that interacts with another application. Many single-line statements are possible culprits for throwing an Exception and there is often nothing I can do to check the state before executing them to prevent these Exceptions. The question is, based on best practices and seasoned experience, how frequently should I lace my code with try/catch blocks? I've listed three examples below, but I'm open to any advice. I'm really hoping to get some pros/cons of various approaches. I can certainly come up with some of my own (greater log granularity for the O-C approach, better performance for the Monolithic approach), so I'm looking for experience over opinion. EDIT: I should add that this application is a batch program. The only "recovery" necessary in most cases is to log the error, clean up gracefully, and quit. So this could be seen to be as much a question of log granularity as exception handling. In my mind's eye I can imagine good reasons for both, so I'm looking for some general advice to help me find an appropriate balance. Monolithic Approach class Program{ public static void Main(){ try{ Step1(); Step2(); Step3(); } catch (Exception e) { Log(e); } finally { CleanUp(); } } public static void Step1(){ ExternalApp.Dangerous1(); ExternalApp.Dangerous2(); } public static void Step2(){ ExternalApp.Dangerous3(); ExternalApp.Dangerous4(); } public static void Step3(){ ExternalApp.Dangerous5(); ExternalApp.Dangerous6(); } } Delegated Approach class Program{ public static void Main(){ try{ Step1(); Step2(); Step3(); } finally { CleanUp(); } } public static void Step1(){ try{ ExternalApp.Dangerous1(); ExternalApp.Dangerous2(); } catch (Exception e) { Log(e); throw; } } public static void Step2(){ try{ ExternalApp.Dangerous3(); ExternalApp.Dangerous4(); } catch (Exception e) { Log(e); throw; } } public static void Step3(){ try{ ExternalApp.Dangerous5(); ExternalApp.Dangerous6(); } catch (Exception e) { Log(e); throw; } } } Obsessive-Compulsive Approach class Program{ public static void Main(){ try{ Step1(); Step2(); Step3(); } finally { CleanUp(); } } public static void Step1(){ try{ ExternalApp.Dangerous1(); } catch (Exception e) { Log(e); throw; } try{ ExternalApp.Dangerous2(); } catch (Exception e) { Log(e); throw; } } public static void Step2(){ try{ ExternalApp.Dangerous3(); } catch (Exception e) { Log(e); throw; } try{ ExternalApp.Dangerous4(); } catch (Exception e) { Log(e); throw; } } public static void Step3(){ try{ ExternalApp.Dangerous5(); } catch (Exception e) { Log(e); throw; } try{ ExternalApp.Dangerous6(); } catch (Exception e) { Log(e); throw; } } } Other approaches welcomed and encouraged. Above are examples only.

    Read the article

  • Podcast Show Notes: Collaborate 10 Wrap-Up - Part 1

    - by Bob Rhubart
    OK, I know last week I promised you a program featuring Oracle ACE Directors Mike van Alst (IT-Eye) and Jordan Braunstein (TUSC) and The Definitive Guide to SOA: Oracle Service Bus author Jeff Davies. But things happen. In this case, what happened was Collaborate 10 in Las Vegas. Prior to the event I asked Oracle ACE Director and OAUG board member Floyd Teter to see if he could round up a couple of people at the event for an impromptu interview over Skype (I was here in Cleveland) to get their impressions of the event. Listen to Part 1 Floyd, armed with his brand new iPad, went above and beyond the call of duty. At the appointed hour, which turned out to be about an hour after the close of Collaborate 10,  Floyd had gathered nine other people to join him in a meeting room somewhere in the Mandalay Bay Convention Center. Here’s the entire roster: Floyd Teter - Project Manager at Jet Propulsion Lab, OAUG Board Blog | Twitter | LinkedIn | Oracle Mix | Oracle ACE Profile Mark Rittman - EMEA Technical Director and Co-Founder, Rittman Mead,  ODTUG Board Blog | Twitter | LinkedIn | Oracle Mix | Oracle ACE Profile Chet Justice - OBI Consultant at BI Wizards Blog | Twitter | LinkedIn | Oracle Mix | Oracle ACE Profile Elke Phelps - Oracle Applications DBA at Humana, OAUG SIG Chair Blog | LinkedIn | Oracle Mix | Book | Oracle ACE Profile Paul Jackson - Oracle Applications DBA at Humana Blog | LinkedIn | Oracle Mix | Book Srini Chavali - Enterprise Database & Tools Leader at Cummins, Inc Blog | LinkedIn | Oracle Mix Dave Ferguson – President, Oracle Applications Users Group LinkedIn | OAUG Profile John King - Owner, King Training Resources Website | LinkedIn | Oracle Mix Gavyn Whyte - Project Portfolio Manager at iFactory Consulting Blog | Twitter | LinkedIn | Oracle Mix John Nicholson - Channels & Alliances at Greenlight Technologies Website | LinkedIn Big thanks to Floyd for assembling the panelists and handling the on-scene MC/hosting duties.  Listen to Part 1 On a technical note, this discussion was conducted over Skype, using Floyd’s iPad, placed in the middle of the table.  During the call the audio was fantastic – the iPad did a remarkable job. Sadly, the Technology Gods were not smiling on me that day. The audio set-up that I tested successfully before the call failed to deliver when we first connected – I could hear the folks in Vegas, but they couldn’t hear me. A frantic, last-minute adjustment appeared to have fixed that problem, and the audio in my headphones from both sides of the conversation was loud and clear.  It wasn’t until I listened to the playback that I realized that something was wrong. So the audio for the Vegas side of the discussion has about the same fidelity as a cell phone. It’s listenable, but disappointing when compared to what it sounded like during the discussion. Still, this was a one-shot deal, and the roster of panelists and the resulting conversation was too good and too much fun to scrap just because of an unfortunate technical glitch.   Part 2 of this Collaborate 10 Wrap-Up will run next week. After that, it’s back on track with the previously scheduled program. So stay tuned: RSS del.icio.us Tags: oracle,otn,collborate 10,c10,oracle ace program,archbeat,arch2arch,oaug,odtug,las vegas Technorati Tags: oracle,otn,collborate 10,c10,oracle ace program,archbeat,arch2arch,oaug,odtug,las vegas

    Read the article

  • Archiving SQLHelp tweets

    - by jamiet
    #SQLHelp is a Twitter hashtag that can be used by any Twitter user to get help from the SQL Server community. I think it’s fair to say that in its first year of existence it has proved to be a very useful resource; however, Kendra Little (@kendra_little) made a very salient point yesterday when she tweeted: Is there a way to search the archives of #sqlhelp Trying to remember answer to a question I know I saw a couple months ago http://twitter.com/#!/Kendra_Little/status/15538234184441856 This highlights an inherent problem with Twitter’s search capability – it simply does not reach far enough back in time. I have taken steps to remedy that situation by putting into place two initiatives to archive Tweets that contain the #sqlhelp hashtag. The Archivist http://archivist.visitmix.com/ is a free service that, quite simply, archives a history of tweets that contain a given search term by periodically polling Twitter’s search service with that search term and subsequently displaying a dashboard providing an aggregate view of those tweets for things like tweet volume over time, top users and top words (Archivist FAQ). I have set up an archive on The Archivist for “sqlhelp” which you can view at http://archivist.visitmix.com/jamiet/7. Here is a screenshot of the SQLHelp dashboard 36 minutes after I set it up: There is lots of good information in there, including the fact that Jonathan Kehayias (@SQLSarg) is the most active SQLHelp tweeter (I suspect as an answerer rather than a questioner) and that SSIS has proven to be a rather (ahem) popular subject!! Datasift The Archivist has its uses, though for our purposes it has a couple of downsides. For starters you cannot search through an archive (which is what Kendra was after), nor can you export the contents of the archive for offline analysis. For those functions we need something a bit more heavyweight and for that I present to you Datasift. Datasift is a tool (currently an alpha release) that allows you to search for tweets and provide them through an object called a Datasift stream. That sounds very similar to normal Twitter search though it has one distinct advantage that other Twitter search tools do not – Datasift has access to Twitter’s Streaming API (aka the Twitter Firehose). In addition it has access to a lot of other rather nice features: It provides the Datasift API that allows you to consume the output of a Datasift stream in your tool of choice (bring on my favourite ultimate mashup tool) It has a query language (called Filtered Stream Definition Language – FSDL for short) A Datasift stream can consume (and filter) other Datasift streams Datasift can (and does) consume services other than Twitter If I refer to Datasift as “ETL for tweets” then you may get some sort of idea what it is all about. Just as I did with The Archivist I have set up a publicly available Datasift stream for “sqlhelp” at http://datasift.net/stream/1581/sqlhelp. Here is the FSDL query that provides the data: twitter.text contains “sqlhelp” Pretty simple eh? At the current time it provides little more than a rudimentary dashboard but as Datasift is currently an alpha release I think this may be worth keeping an eye on. 
The real value though is the ability to consume the output of a stream via Datasift’s RESTful API, observe: http://api.datasift.net/stream.xml?stream_identifier=c7015255f07e982afdeebdf1ae6e3c0d&username=jamiet&api_key=XXXXXXX (Note that an api_key is required during the alpha period so, given that I’m not supplying my api_key, this URI will not work for you) Just to prove that a Datasift stream can indeed consume data from another stream I have set up a second stream that further filters the first one for tweets containing “SSIS”. That one is at http://datasift.net/stream/1586/ssis-sqlhelp and here is the FSDL query: rule "414c9845685ff8d2548999cf3162e897" and (interaction.content contains "ssis") When Datasift moves beyond alpha I’ll re-assess how useful this is going to be and post a follow-up blog. @Jamiet
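
    As a rough illustration of consuming that output in your tool of choice, the C# sketch below simply fetches the stream.xml endpoint shown above over HTTP. The api_key value is a placeholder and the response is treated as opaque XML, since how you parse it depends on what Datasift actually returns; this is an assumption-laden example, not documented Datasift client code.

    using System;
    using System.Net;

    class SqlHelpStreamReader
    {
        static void Main()
        {
            // Same URI pattern as above, with a placeholder api_key.
            const string uri = "http://api.datasift.net/stream.xml"
                             + "?stream_identifier=c7015255f07e982afdeebdf1ae6e3c0d"
                             + "&username=jamiet&api_key=YOUR_API_KEY";

            using (var client = new WebClient())
            {
                string xml = client.DownloadString(uri);
                Console.WriteLine(xml); // hand this to XDocument.Parse, SSIS, or whatever tool you prefer
            }
        }
    }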

    Read the article

< Previous Page | 26 27 28 29 30 31 32 33 34 35 36 37  | Next Page >