Search Results

Search found 20 results on 1 page for 'ananth p'.

Page 1/1

  • nanoc installation setup in Linux

    - by Ananth
    I'm a newbie to Ruby, trying to set up nanoc on my machine. I'm running Ubuntu 14.04. After installing nanoc, when I type $ nanoc --version I get the following error:

        /home/ananth/.rvm/gems/ruby-head@global/gems/bundler-1.6.2/lib/bundler/shared_helpers.rb:24:in `default_gemfile': Could not locate Gemfile (Bundler::GemfileNotFound)
        from /home/ananth/.rvm/gems/ruby-head@global/gems/bundler-1.6.2/lib/bundler.rb:248:in `default_gemfile'
        from /home/ananth/.rvm/gems/ruby-head@global/gems/bundler-1.6.2/lib/bundler.rb:192:in `root'
        from /home/ananth/.rvm/gems/ruby-head@global/gems/bundler-1.6.2/lib/bundler.rb:99:in `bundle_path'
        from /home/ananth/.rvm/gems/ruby-head@global/gems/bundler-1.6.2/lib/bundler.rb:400:in `configure_gem_home_and_path'
        from /home/ananth/.rvm/gems/ruby-head@global/gems/bundler-1.6.2/lib/bundler.rb:90:in `configure'
        from /home/ananth/.rvm/gems/ruby-head@global/gems/bundler-1.6.2/lib/bundler.rb:151:in `definition'
        from /home/ananth/.rvm/gems/ruby-head@global/gems/bundler-1.6.2/lib/bundler.rb:116:in `setup'
        from /home/ananth/.rvm/gems/ruby-head@global/gems/bundler-1.6.2/lib/bundler.rb:132:in `require'
        from /home/ananth/.rvm/gems/ruby-head/gems/nanoc-3.7.0/bin/nanoc:7:in `<top (required)>'
        from /home/ananth/.rvm/gems/ruby-head/bin/nanoc:23:in `load'
        from /home/ananth/.rvm/gems/ruby-head/bin/nanoc:23:in `<main>'
        from /home/ananth/.rvm/gems/ruby-head/bin/ruby_executable_hooks:15:in `eval'
        from /home/ananth/.rvm/gems/ruby-head/bin/ruby_executable_hooks:15:in `<main>'

    Am I missing something in my rvm setup? Does this have anything to do with $PATH, .bashrc or .bash_profile? Thanks for the help in advance!
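
    The trace shows that the nanoc executable tries to load a project Gemfile through Bundler, and Bundler raises GemfileNotFound because no Gemfile exists in the current directory or any parent. A minimal sketch of one way around it, assuming a project directory name (my-site) of your own choosing:

        # create a project with a Gemfile and run nanoc through Bundler
        mkdir my-site && cd my-site
        printf 'source "https://rubygems.org"\ngem "nanoc"\n' > Gemfile
        bundle install
        bundle exec nanoc --version

    Running the command from any directory that already contains a Gemfile listing nanoc sidesteps the error in the same way.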

    Read the article

  • monitor http traffic from non-browser

    - by Ananth
    Hi, I want to monitor HTTP requests generated by an exe. Is there any tool that can help me? The exe calls my ASP.NET web page to register a user, constructing the request with all the data in it. When the request reaches my web page, I don't see any data. I want to monitor the Request object and the traffic to find out what is really happening. Any help is appreciated. Thanks. Ananth
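
    A local proxy such as Fiddler or a packet capture in Wireshark will show the raw request the exe sends. On the server side, a quick way to see exactly what arrived is to dump the incoming request from the page itself. A minimal sketch, assuming a Web Forms code-behind and a hypothetical log path:

        // Page_Load sketch: log the incoming method, form fields and raw body
        protected void Page_Load(object sender, EventArgs e)
        {
            using (var sw = new System.IO.StreamWriter(@"c:\request-log.txt", true))
            {
                sw.WriteLine("Method: " + Request.HttpMethod);
                foreach (string key in Request.Form)
                    sw.WriteLine(key + " = " + Request.Form[key]);

                Request.InputStream.Position = 0;
                using (var body = new System.IO.StreamReader(Request.InputStream))
                    sw.WriteLine("Raw body: " + body.ReadToEnd());
            }
        }

    If Request.Form is empty but the raw body is not, the exe is probably not sending the Content-Type: application/x-www-form-urlencoded header that ASP.NET needs before it will parse POST fields.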

    Read the article

  • Is DiskDrive Signature unique?

    - by Ananth
    I am currently using WMI to query various details about the underlying hardware in order to uniquely identify a machine. To this end, I came across a field called "Signature" under "Win32_DiskDrive". Is this signature unique across machines (globally)? Can this be used reliably to identify the machine? Thanks, Ananth
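
    For reference, the disk signature is a 32-bit value written to the MBR when Windows initializes a disk, so it is not guaranteed to be globally unique, and Windows rewrites it if a collision is detected or the disk is re-initialized. A minimal sketch of reading it with System.Management (add a reference to System.Management.dll):

        // query Win32_DiskDrive for each drive's signature
        using System;
        using System.Management;

        class DiskSignature
        {
            static void Main()
            {
                var searcher = new ManagementObjectSearcher(
                    "SELECT Signature, Model FROM Win32_DiskDrive");
                foreach (ManagementObject disk in searcher.Get())
                    Console.WriteLine("{0}: {1}", disk["Model"], disk["Signature"]);
            }
        }

    Because signatures are at best unique per machine, combining several identifiers (BIOS serial, MAC address, volume serial) makes a more reliable machine fingerprint.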

    Read the article

  • monitor http request from non-browser

    - by Ananth
    Hi, I want to monitor HTTP requests generated by an exe. Is there any tool that can help me? The exe constructs the POST data and calls my ASP.NET web page to register a user. When the request reaches my web page, I don't see any data. I want to monitor the Request object and the traffic to find out what is really happening. Any help is appreciated. Thanks. ~/Ananth

    Read the article

  • ASP.Net Session data lost between pages

    - by Ananth
    Hi, I came across some weird behavior today with my web application. When I navigate from one page to another, I lose the data in one particular session variable. When I launch the app in Firefox, I can see that the session data is not lost. I use Response.Redirect(page2, false) to redirect to the other page. The code below was used to track the session variables:

        // dump all session keys and values to a file
        using (var sw = new System.IO.StreamWriter(@"c:\test.txt", true))
        {
            for (int i = 0; i < Session.Count; i++)
            {
                sw.WriteLine(Session.Keys[i] + " " + Session.Contents[i]);
            }
        }

    Can anyone help me with this? Any help is appreciated. ~/Ananth

    Read the article

  • Clear edged sprite

    - by Ananth
    I am a newbie to cocos2d. I would like to let the user draw, similar to what a painting brush would do, and I am using a CCSprite for the brush. I have almost implemented the velocity, color and opacity factors for the tool, but I can't get the sprite edges to be as clean as they should be. I can only draw strokes like the first image (http://i.imgur.com/KBe0L.png), which have soft, blurred edges, but I want hard, clean outside edges as in the second image (http://i.stack.imgur.com/GrFlv.png). The piece of code I'm using is:

        glEnable(GL_BLEND);
        [brush.texture setAliasTexParameters];
        [brush setBlendFunc:(ccBlendFunc){GL_ONE, GL_ONE_MINUS_SRC_ALPHA}];
        [brush visit];

    I suspect the problem is the blending mode. I have tried several blending modes, but without the expected results. I have been trying this for the past five days and am quite confused. Can someone help me sort this out? Thanks in advance.

    Read the article

  • Replicating between Cloud and On-Premises using Oracle GoldenGate

    - by Ananth R. Tiru
    Do you have applications running in the cloud that you need to connect with on-premises systems? The most likely answer to this question is a resounding YES! If so, then you understand the importance of keeping the data fresh at all times across the cloud and on-premises environments. This is also one of the key focus areas for the new GoldenGate 12c release, which we announced a couple of weeks ago via a press release.

    Most enterprises have spent years avoiding the data "silos" that inhibit productivity. For example, an enterprise which has adopted a CRM strategy could be relying on an on-premises marketing application used for developing and nurturing leads, while at the same time using a SaaS-based sales application to create opportunities and quotes. The sales and marketing teams which use these systems need to be able to access and share the data in a reliable and cohesive way. This example extends to other application areas such as HR, supply chain and finance, and to the demands users place on getting a consistent view of the data.

    When it comes to moving data in hybrid environments, the key requirements include minimal latency, reliability and security:

    Data must remain fresh. As data ages it becomes less relevant and less valuable; day-old data is often insufficient in today's competitive landscape.
    Reliability must be guaranteed despite system or connectivity issues that can occur between the cloud and on-premises instances.
    Security is a key concern when replicating between cloud and on-premises instances.

    There are several options to consider when replicating between the cloud and on-premises instances.

    Option 1: Secured network established between the cloud and on-premises. A secured network is established between the cloud and on-premises, which enables the applications (including replication software) running on the cloud and on-premises to have seamless connectivity to other applications irrespective of where they are physically located.

    Option 2: Restricted network established between the cloud and on-premises. A restricted network is established between the cloud and on-premises instances, which opens the specific ports required by replication on both the cloud and the on-premises instances and white-lists the IP addresses of the cloud and on-premises instances.

    Option 3: Restricted network access from on-premises and cloud through HTTP proxy. This option can be considered when the ports required by the applications (including replication software) are not open and the cloud instance is not white-listed on the on-premises instance. Tunneling through an HTTP proxy should only be considered when proper security exceptions are obtained.

    Oracle GoldenGate

    Oracle GoldenGate is used by major Fortune 500 companies and other industry leaders worldwide to support mission-critical systems for data availability and integration. Oracle GoldenGate addresses the requirements for ensuring data consistency between cloud and on-premises instances, thus enabling business processes to run effectively and reliably. The architecture diagram below illustrates the scenario where the cloud and the on-premises instance are connected using GoldenGate through a secured network. In this scenario, Oracle GoldenGate is installed and configured on both the cloud and the on-premises instances. On the cloud instance, Oracle GoldenGate is installed and configured on the machine where the database instance can be accessed.
    Oracle GoldenGate can be configured for unidirectional or bi-directional replication between the cloud and on-premises instances. The specific configuration of the Oracle GoldenGate processes depends on the option selected for establishing connectivity between the cloud and on-premises instances; a sketch of one such process follows below. The knowledge article (ID 1588484.1) titled 'Replicating between Cloud and On-Premises using Oracle GoldenGate' discusses the options for replicating between the cloud and on-premises instances in detail. The article can be found on My Oracle Support.

    To learn more about Oracle GoldenGate 12c, register for our launch webcast, where we will go into these new features in more detail. You may also want to download our white paper, "Oracle GoldenGate 12c Release 1 New Features Overview". I would love to hear your requirements for replicating between on-premises and cloud instances, as well as your comments about the strategy discussed in the knowledge article to address your needs. Please post your comments in this blog or in the Oracle GoldenGate public forum: https://forums.oracle.com/community/developer/english/business_intelligence/system_management_and_integration/goldengate
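
    For a concrete flavor of what that configuration involves, here is a minimal sketch of an Extract parameter file pushing changes from an on-premises database toward a cloud instance. The host name, port, trail name, credentials and table list are all hypothetical placeholders; the real values depend on the connectivity option chosen above:

        EXTRACT exta
        USERID ogguser, PASSWORD oggpwd
        RMTHOST cloudhost.example.com, MGRPORT 7809
        RMTTRAIL ./dirdat/aa
        TABLE hr.*;

    A matching Replicat on the cloud instance applies the trail data to the target database; reversing the roles gives bi-directional replication.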

    Read the article

  • No DocBrokers are configured

    - by Ananth
    I get this error while trying to access Documentum through Webtop: "An error has occurred. [DM_DOCBROKER_E_NO_DOCBROKERS] error: 'No DocBrokers are configured'". I am trying to access Documentum for the first time. Any help?
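
    This error usually means the DFC client has no connection-broker entry to look up. A minimal sketch of the relevant dfc.properties settings, assuming a hypothetical host name and the default DocBroker port of 1489; ask your Documentum administrator for the real values:

        dfc.docbroker.host[0]=contentserver.example.com
        dfc.docbroker.port[0]=1489

    The dfc.properties file lives in the DFC configuration directory of the Webtop deployment (typically under WEB-INF/classes); restart the application server after editing it.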

    Read the article

  • System.AccessViolationException: Attempted to read or write protected memory.

    - by Ananth
    I get the following exception when I try to "find and replace" in a Word 2007 document, on Windows Vista and Windows 7: System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. at Microsoft.Office.Interop.Word.Find.Execute(Object& FindText, Object& MatchCase, Object& MatchWholeWord, Object& MatchWildcards, Object& MatchSoundsLike, Object& MatchAllWordForms, Object& Forward, Object& Wrap, Object& Format, Object& ReplaceWith, Object& Replace, Object& MatchKashida, Object& MatchDiacritics, Object& MatchAlefHamza, Object& MatchControl). Is there any solution for this? I am using .NET 3.5 and C#.
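
    An AccessViolationException from Find.Execute often points at a COM interop problem rather than the search itself: calling into Word from a different thread than the one that created the Application object, racing a Word instance that is still busy, or holding stale RCW references. A sketch of a defensive call pattern, with every argument passed explicitly (doc and the search strings are hypothetical):

        object findText = "oldText";
        object replaceWith = "newText";
        object replaceAll = Microsoft.Office.Interop.Word.WdReplace.wdReplaceAll;
        object missing = System.Type.Missing;

        Microsoft.Office.Interop.Word.Find find = doc.Content.Find;
        find.ClearFormatting();
        // 15 arguments, matching the Execute signature in the stack trace
        find.Execute(ref findText, ref missing, ref missing, ref missing,
                     ref missing, ref missing, ref missing, ref missing,
                     ref missing, ref replaceWith, ref replaceAll,
                     ref missing, ref missing, ref missing, ref missing);

    If the exception persists, make sure all Word calls happen on a single STA thread and that the document is fully opened before searching.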

    Read the article

  • Lilypond: Customize bar lines, recursively, automatically?

    - by ananth.p
    I'm working on Carnatic music scores that involve complex time signatures and require modified bar lines. The bar-line pattern for an 8/4 measure is: beats 1 2 3 4 (dashed bar here), 5 6 (dotted bar here), 7 8 (double bar here). Here's one bar of the actual score:

        g16( f) d8 ees( ees) d16( c d8) bes16[( d c bes \bar "dashed" a g]) a[( bes c] d[ c d]) \bar ":" g8( f16) ees8( d16 c d) \bar "||"

    Is there a way to automate these bar lines?
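
    One way to avoid repeating the \bar commands is to keep them in a separate voice of invisible skips and combine that voice with the music, so the pattern is written once and reused. A sketch under that assumption, using the 4 + 2 + 2 crotchet grouping from the question (the repeat count of 8 is purely illustrative):

        barPattern = \repeat unfold 8 {
          s4*4 \bar "dashed"
          s4*2 \bar ":"
          s4*2 \bar "||"
        }
        music = { g16( f) d8 ees( ees) d16( c d8) bes16[( d c bes a g]) a[( bes c] d[ c d]) g8( f16) ees8( d16 c d) }
        \score { << \barPattern \music >> }

    For longer scores, a Scheme music function could generate the same skip pattern programmatically instead of unfolding a fixed count.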

    Read the article

  • Block the user from changing the System Date/Time

    - by Ananth
    We have a Windows application (C# .NET) and we'll be giving installers to the client. The requirement is that once the application has been installed, the user should not be able to edit the system time/date. This is to make sure that the application-generated dates/reports are not manipulated. My target OS is Windows XP. What is the best way to do that? Does the OS provide any facility for it? Thanks in advance.
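
    Windows gates this through the SeSystemtimePrivilege user right ("Change the system time" under Local Policies > User Rights Assignment in the Local Security Policy console), so revoking that right from ordinary users blocks the change at the OS level. A sketch of automating the revocation with ntrights.exe from the Windows Server 2003 Resource Kit, assuming that tool is available and the group name is illustrative; run as an administrator:

        ntrights -r SeSystemtimePrivilege -u "Users"

    Note that a local administrator can always grant the right back, so for tamper-resistant reports it is safer for the application to also record a trusted time from a server or NTP source rather than relying on the local clock alone.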

    Read the article

  • Unable to export contents of a DataTable (with French-formatted numbers) to XML

    - by Ananth
    I have a DataTable with numbers formatted according to the current regional settings, i.e. in French the decimal separator is ',' instead of the English '.'. I need to export it to XML, and the numbers in the XML need to be formatted according to the current regional settings. Right now the numbers in the XML come out formatted in English. Is there any way to make the number formatting in the XML follow the current regional settings (or the locale of the DataTable) during the export?
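
    DataTable.WriteXml serializes typed values with the invariant (English-style) culture, so culture-specific output needs a manual pass. A minimal sketch that writes the table through an XmlWriter and formats each value with the current culture (assuming the table has a name; element names here are illustrative):

        using System;
        using System.Data;
        using System.Globalization;
        using System.Xml;

        static class CultureXmlExport
        {
            // write each row/column, formatting values with the current culture
            public static void WriteTableXml(DataTable table, string path)
            {
                CultureInfo culture = CultureInfo.CurrentCulture; // e.g. fr-FR gives 3,14
                using (XmlWriter writer = XmlWriter.Create(path,
                           new XmlWriterSettings { Indent = true }))
                {
                    writer.WriteStartElement(table.TableName);
                    foreach (DataRow row in table.Rows)
                    {
                        writer.WriteStartElement("Row");
                        foreach (DataColumn col in table.Columns)
                            writer.WriteElementString(col.ColumnName,
                                Convert.ToString(row[col], culture));
                        writer.WriteEndElement();
                    }
                    writer.WriteEndElement();
                }
            }
        }

    Keep in mind that consumers of the XML must parse the numbers with the same culture, which is why the invariant default exists in the first place.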

    Read the article

  • pointer delegate in STL set.

    - by ananth
    Hi, I'm kind of stuck using a set with smart-pointer elements. My code is as follows:

        void Graph::addNodes (NodeSet& nodes)
        {
            for (NodeSet::iterator pos = nodes.begin(); pos != nodes.end(); ++pos)
            {
                addNode(*pos);
            }
        }

    Here NodeSet is defined as: typedef std::set<Node_ptr> NodeSet; The above piece of code works perfectly on my Windows machine, but when I run the same piece of code on a Mac, it gives me the following error: no matching function for call to 'Graph::addNode(const boost::shared_ptr<Node>&)'. FYI, Node_ptr is of type: typedef boost::shared_ptr<Node> Node_ptr; Can somebody please tell me why this is happening?
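
    The likely cause: std::set elements are immutable, and on the GCC standard library used on the Mac, set<T>::iterator behaves as a const_iterator, so *pos yields a const Node_ptr&; the MSVC headers of that era were more permissive. A sketch of the fix, assuming the hypothetical Graph::addNode currently takes a non-const reference:

        // take the element by const reference (or by value), since
        // dereferencing a std::set iterator yields a const element
        void Graph::addNode(const Node_ptr& node);

    Copying a boost::shared_ptr is cheap, so taking it by value is an equally valid signature.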

    Read the article

  • OLL Live webcast - Using SQL for Pattern Matching in Oracle Database

    - by KLaker
    If you are interested in learning about our exciting new 12c SQL pattern matching feature, then mark your diaries. On Wednesday, October 30th at 8:00 am (US/Pacific time zone), Supriya Ananth, one of our top curriculum developers at Oracle, will be hosting an OLL webcast on our new SQL pattern matching feature.

    The ability to recognize patterns in a sequence of rows has been widely desired, but was not possible with SQL until now. Row pattern matching in native SQL improves application and development productivity and query efficiency for row-sequence analysis. With Oracle Database 12c you can use the new MATCH_RECOGNIZE clause to perform pattern matching in SQL to do the following:

    Logically partition and order the data using the PARTITION BY and ORDER BY clauses.
    Use regular-expression syntax to define patterns of rows to seek using the PATTERN clause. These patterns are a powerful and expressive feature, applied to the pattern variables you define.
    Specify the logical conditions required to map a row to a row pattern variable in the DEFINE clause.
    Define measures, which are expressions usable in the MEASURES clause of the SQL query.

    For more information and to register for this exciting webcast please visit the OLL Live website: https://apex.oracle.com/pls/apex/f?p=44785:145:116820049307135::::P145_EVENT_ID,P145_PREV_PAGE:461,143. Please note: if the above link does not work, go to OLL (https://apex.oracle.com/pls/apex/f?p=44785:1:) and click the OLL Live icon (upper right, beneath the Login link, or the Logout link if you are already logged in). The pattern matching webcast is listed on the calendar of events on 30 October.
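
    As a taste of the syntax, here is a sketch of the classic V-shape query from the Oracle documentation, assuming a ticker table with symbol, tstamp and price columns; it finds stretches where a price falls and then recovers:

        SELECT *
        FROM ticker
        MATCH_RECOGNIZE (
          PARTITION BY symbol
          ORDER BY tstamp
          MEASURES STRT.tstamp AS start_tstamp,
                   LAST(DOWN.tstamp) AS bottom_tstamp,
                   LAST(UP.tstamp) AS end_tstamp
          ONE ROW PER MATCH
          AFTER MATCH SKIP TO LAST UP
          PATTERN (STRT DOWN+ UP+)
          DEFINE
            DOWN AS DOWN.price < PREV(DOWN.price),
            UP   AS UP.price   > PREV(UP.price)
        ) MR;

    The PATTERN clause reads like a regular expression over rows: one starting row, one or more falling rows, then one or more rising rows.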

    Read the article

  • GPGPU

    What

    GPU obviously stands for Graphics Processing Unit (the silicon powering the display you are using to read this blog post). The extra GP in front of that stands for General Purpose computing. So, altogether GPGPU refers to computing we can perform on a GPU for purposes beyond just drawing on the screen. In effect, we can use a GPGPU a bit like we already use a CPU: to perform some calculation (that doesn't have to have any visual element to it). The attraction is that a GPGPU can be orders of magnitude faster than a CPU.

    Why

    When I was at the SuperComputing conference in Portland last November, GPGPUs were all the rage. A quick online search reveals many articles introducing the GPGPU topic. I'll just share 3 here: pcper (ignoring all pages except the first, it is a good consumer perspective), gizmodo (nice take using mostly layman terms) and vizworld (answering the question on "what's the big deal").

    The GPGPU programming paradigm (from a high level) is simple: in your CPU program you define functions (aka kernels) that take some input, perform the costly operation and return the output. The kernels are the things that execute on the GPGPU, leveraging its power (and hence executing faster than they could on the CPU), while the host CPU program waits for the results or asynchronously performs other tasks.

    However, GPGPUs have different characteristics to CPUs, which means they are suitable only for certain classes of problem (i.e. data-parallel algorithms) and not for others (e.g. algorithms with branching or recursion or other complex flow control). You also pay a high cost for transferring the input data from the CPU to the GPU (and vice versa, the results back to the CPU), so the computation itself has to be long enough to justify the transfer overhead. If your problem space fits the criteria then you probably want to check out this technology.

    How

    So where can you get a graphics card to start playing with all this? At the time of writing, the two main vendors ATI (owned by AMD) and NVIDIA are the obvious players in this industry. You can read about GPGPU on this AMD page and also on this NVIDIA page. NVIDIA's website also has a free chapter on the topic from the "GPU Gems" book: A Toolkit for Computation on GPUs.

    If you followed the links above, then you've already come across some of the choices of programming models that are available today. Essentially, AMD is offering their ATI Stream technology accessible via a language they call Brook+; NVIDIA offers their CUDA platform which is accessible from CUDA C. Choosing either of those locks you into the GPU vendor, and hence your code cannot run on systems with cards from the other vendor (e.g. imagine if your CPU code would run on Intel chips but not AMD chips). Having said that, both vendors plan to support a new emerging standard called OpenCL, which theoretically means your kernels can execute on any GPU that supports it. To learn more about all of these there is a website: gpgpu.org. The caveat about that site is that (currently) it completely ignores the Microsoft approach, which I touch on next.

    On Windows, there is already a cross-GPU-vendor way of programming GPUs and that is the DirectX API. Specifically, on Windows Vista and Windows 7, the DirectX 11 API offers a dedicated subset of the API for GPGPU programming: DirectCompute. You use this API on the CPU side to set up and execute the kernels that run on the GPU. The kernels are written in a language called HLSL (High Level Shader Language). You can use DirectCompute with HLSL to write a "compute shader", which is the term DirectX uses for what I've been referring to in this post as a "kernel". For a comprehensive collection of links about this (including tutorials, videos and samples) please see my blog post: DirectCompute.

    Note that there are many efforts to build even higher level languages on top of DirectX that aim to expose GPGPU programming to a wider audience by making it as easy as today's mainstream programming models. I'll mention here just two of those efforts: Accelerator from MSR and Brahma by Ananth. Comments about this post welcome at the original blog.
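
    To make the kernel idea concrete, here is a minimal sketch of an HLSL compute shader that doubles every element of a buffer; the buffer name and thread-group size are arbitrary choices for illustration:

        // compute shader: each thread doubles one element of the buffer
        RWStructuredBuffer<float> data : register(u0);

        [numthreads(64, 1, 1)]
        void CSMain(uint3 id : SV_DispatchThreadID)
        {
            data[id.x] = data[id.x] * 2.0f;
        }

    The CPU-side DirectX 11 code compiles this shader, binds the buffer as an unordered access view, and launches it with a Dispatch call sized to cover the data.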

    Read the article

  • New Feature in ODI 11.1.1.6: ODI for Big Data

    - by Julien Testut
    By Ananth Tirupattur

    Starting with Oracle Data Integrator 11.1.1.6.0, ODI offers a solution for processing Big Data. This post provides an overview of this feature. With all the buzz around Big Data, and before getting into the details of ODI for Big Data, I will provide a brief introduction to Big Data and the Oracle solution for Big Data.

    So, what is Big Data? Big Data includes structured data (data from relational data stores and XML data stores), semi-structured data (such as data from weblogs) and unstructured data (such as text blobs and images). Traditionally, business decisions are based on the information gathered from transactional data. For example, transactional data from CRM applications is fed to a decision system for analysis and decision making. Products such as ODI play a key role in enabling decision systems. However, with the emergence of massive amounts of semi-structured and unstructured data, it is important for decision systems to include them in the analysis to achieve better decision-making capability.

    While there is an abundance of opportunity for businesses to gain competitive advantage, processing Big Data has its challenges: the volume of data, the velocity of data (the high rate at which data is generated) and the variety of data. In order to address these challenges and convert them into opportunities, we need an appropriate framework, platform and the right set of tools.

    Hadoop is an open-source framework: a highly scalable, fault-tolerant system for storing and processing large amounts of data. Hadoop provides 2 key services: distributed and reliable storage, called the Hadoop Distributed File System (HDFS), and a framework for parallel data processing, called Map-Reduce. Innovations in Hadoop and its related technology continue to evolve rapidly, so it is highly recommended to follow information on the web to keep up with the latest developments.

    Oracle's vision is to provide a comprehensive solution to address the challenges faced by Big Data. Oracle provides the necessary hardware, software and tools for processing Big Data. The Oracle solution includes:

    Big Data Appliance
    Oracle NoSQL Database
    Cloudera distribution for Hadoop
    Oracle R Enterprise (R is a statistical package which is very popular among data scientists)
    ODI solution for Big Data
    Oracle Loader for Hadoop, for loading data from Hadoop to Oracle

    Further details can be found here: http://www.oracle.com/us/products/database/big-data-appliance/overview/index.html

    ODI Solution for Big Data: ODI's goal is to minimize the need to understand the complexity of the Hadoop framework and to simplify the adoption of Big Data processing in an enterprise. ODI provides the capabilities for an integrated architecture for processing Big Data, including the capability to load data into Hadoop, process data in Hadoop and load data from Hadoop into Oracle. ODI is expanding its support for Big Data by providing the following out-of-the-box Knowledge Modules (KMs):
    IKM File to Hive (LOAD DATA): load unstructured data from a file (local file system or HDFS) into Hive
    IKM Hive Control Append: transform and validate structured data on Hive
    IKM Hive Transform: transform unstructured data on Hive
    IKM File/Hive to Oracle (OLH): load processed data in Hive to Oracle
    RKM Hive: reverse-engineer Hive tables to generate models

    Using the loading KM you can map files (local and HDFS files) to the corresponding Hive tables; a sketch of the underlying HiveQL follows below. For example, you can map weblog files categorized by date into a correspondingly partitioned Hive table schema.

    Using the Hive Control Append KM you can validate and transform data in Hive. In the example below, two source Hive tables are joined and mapped to a target Hive table.

    The Hive Transform KM facilitates processing of semi-structured data in Hive. In the example below, the data from a weblog is processed using a Perl script and mapped to a target Hive table.

    Using the Oracle Loader for Hadoop (OLH) KM you can load data from a Hive table or HDFS to a corresponding table in Oracle. OLH is available as a standalone product. ODI greatly enhances OLH's capability by generating the configuration and mapping files for OLH based on the configuration provided in the interface and KM options, and seamlessly invokes OLH when executing the scenario. In the example below, an HDFS file is mapped to a table in Oracle.

    Development and Deployment: The following diagram illustrates the development and deployment of the ODI solution for Big Data. Using ODI Studio on your development machine, create and develop the ODI solution for processing Big Data by connecting to a MySQL DB or Oracle database on a BDA machine or Hadoop cluster. Schedule the ODI scenarios to be executed on the ODI agent deployed on the BDA machine or Hadoop cluster. The ODI solution for Big Data provides several exciting new capabilities to facilitate the adoption of Big Data in an enterprise. You can find more information about the Oracle Big Data connectors on OTN.
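
    As an illustration of what the file-to-Hive loading KM drives under the covers, here is a sketch of the HiveQL involved, assuming a hypothetical weblogs table partitioned by date:

        -- load a day's log files from HDFS into the matching Hive partition
        LOAD DATA INPATH '/data/weblogs/2012-04-01'
        INTO TABLE weblogs PARTITION (log_date = '2012-04-01');

    KM options control details such as whether the source is the local file system (LOAD DATA LOCAL INPATH) and whether the target partition is overwritten.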
You can find an overview of all the new features introduced in ODI 11.1.1.6 in the following document: ODI 11.1.1.6 New Features Overview

    Read the article
