Search Results

Search found 85480 results on 3420 pages for 'change data capture'.

Page 25/3420 | < Previous Page | 21 22 23 24 25 26 27 28 29 30 31 32  | Next Page >

  • jQuery: change event not triggered in IE

    - by Kenneth
    Hi, I have some code updating a dropdown list, and then I fire the "change" event manually. It works like it should in Firefox, Opera and so on, but not in Internet Explorer. Any idea why? Code attached below.

        $(".bringFraktvalgRadio").click(function() {
            var selectedValue = $(".bringFraktvalgRadio:checked").val();
            $("#<%= dropDeliveryOption.ClientID %> option[value=" + selectedValue + "]").attr("selected", true);
            $("#<%= dropDeliveryOption.ClientID %>").trigger("change");
        });
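
    One hedged workaround sketch (the dropDeliveryOption id below stands in for the rendered ASP.NET ClientID): jQuery's .trigger("change") runs handlers bound through jQuery plus any inline onchange, but it does not dispatch a real DOM event, so listeners attached with attachEvent in old IE never see it. Firing the native event directly covers that case.

        // run after updating the selected option, as in the snippet above
        var drop = document.getElementById("dropDeliveryOption"); // hypothetical id
        if (drop.fireEvent) {
            drop.fireEvent("onchange");                // IE < 9
        } else {
            var evt = document.createEvent("HTMLEvents");
            evt.initEvent("change", true, false);      // bubbles, not cancelable
            drop.dispatchEvent(evt);                   // standards browsers
        }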

    Read the article

  • HTML Input on change of value

    - by arik-so
    Hello, I have an input tag. This tag does not have the autocomplete feature turned off, so its value can change without any key being released (e.g. when the browser autocompletes) before focus moves to another field. My question is: how can I detect ANY value change in this particular field, something like:

        <input onvaluechange="//do following..." />

    The JavaScript onchange attribute does not fire on every change of value, only on changes like blur, focus, etc... Thanks in advance!
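
    A hedged sketch of the usual answer (the myInput id is hypothetical): the HTML5 input event fires on every value change, whether from typing, paste or autocomplete, and old IE exposes the proprietary propertychange event as a fallback.

        var field = document.getElementById("myInput"); // hypothetical id
        function onValueChange() {
            // react to field.value here
        }
        if ("oninput" in field) {
            field.addEventListener("input", onValueChange, false); // standards browsers
        } else {
            field.attachEvent("onpropertychange", function () {    // IE < 9 fallback
                if (window.event.propertyName === "value") onValueChange();
            });
        }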

    Read the article

  • jquery dynamically added checkbox not working with change() function

    - by estern
    I dynamically load a few li elements, each containing a label and a checkbox, into another visible ul. I set the checkboxes to checked="checked", and I am trying to trigger an event when one of these dynamically inserted checkboxes changes, but nothing occurs. Here is the jQuery:

        $(".otherProductCheckbox:checkbox").change(function() {
            alert('test');
        });

    Here is the HTML for the dynamically added li elements:

        <li class="otherProduct">
            <input type="checkbox" class="otherProductCheckbox radioCheck" checked="checked"/>
            <label>Product Name</label>
        </li>

    Any idea why I can't get the alert to happen when the checkbox changes its checked state?
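
    A minimal sketch of the usual fix, event delegation: .change() binds only to checkboxes that exist at bind time, while a delegated handler on a stable ancestor also catches checkboxes inserted later. The .on() form assumes jQuery 1.7+; on the 1.3-1.6 versions current when this was asked, $(".otherProductCheckbox:checkbox").live("change", fn) was the equivalent.

        $(document).on("change", ".otherProductCheckbox:checkbox", function () {
            // fires for checkboxes added after page load as well
            alert(this.checked ? "checked" : "unchecked");
        });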

    Read the article

  • Making Spring Data JPA work with DataNucleus (GAE) (Spring Boot)

    - by xybrek
    There are several hints that Spring Data works with Google App Engine, like:

        http://tommysiu.blogspot.com/2014/01/spring-data-on-gae-part-1.html
        http://blog.eisele.net/2009/07/spring-300m3-on-google-appengine-with.html

    Most of the examples are not "Spring Boot"-based, so I've been trying to retrofit things with it. However, I've been stuck with this error for days and days:

        [INFO] Caused by: java.lang.NullPointerException
        [INFO] at org.datanucleus.api.jpa.metamodel.SingularAttributeImpl.isVersion(SingularAttributeImpl.java:79)
        [INFO] at org.springframework.data.jpa.repository.support.JpaMetamodelEntityInformation.findVersionAttribute(JpaMetamodelEntityInformation.java:102)
        [INFO] at org.springframework.data.jpa.repository.support.JpaMetamodelEntityInformation.<init>(JpaMetamodelEntityInformation.java:79)
        [INFO] at org.springframework.data.jpa.repository.support.JpaEntityInformationSupport.getMetadata(JpaEntityInformationSupport.java:65)
        [INFO] at org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getEntityInformation(JpaRepositoryFactory.java:149)
        [INFO] at org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getTargetRepository(JpaRepositoryFactory.java:88)
        [INFO] at org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getTargetRepository(JpaRepositoryFactory.java:68)
        [INFO] at org.springframework.data.repository.core.support.RepositoryFactorySupport.getRepository(RepositoryFactorySupport.java:158)
        [INFO] at org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport.initAndReturn(RepositoryFactoryBeanSupport.java:224)
        [INFO] at org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport.afterPropertiesSet(RepositoryFactoryBeanSupport.java:210)
        [INFO] at org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean.afterPropertiesSet(JpaRepositoryFactoryBean.java:92)
        [INFO] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory$6.run(AbstractAutowireCapableBeanFactory.java:1602)
        [INFO] at java.security.AccessController.doPrivileged(Native Method)
        [INFO] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1599)
        [INFO] at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1549)
        [INFO] ... 40 more

    This is where I'm trying to use Spring Data JPA with DataNucleus/AppEngine:

        @Configuration
        @ComponentScan
        @EnableJpaRepositories
        @EnableTransactionManagement
        class JpaApplicationConfig {

            private static final Logger logger = Logger
                    .getLogger(JpaApplicationConfig.class.getName());

            @Bean
            public EntityManagerFactory entityManagerFactory() {
                logger.info("Loading Entity Manager...");
                return Persistence
                        .createEntityManagerFactory("transactions-optional");
            }

            @Bean
            public PlatformTransactionManager transactionManager() {
                logger.info("Loading Transaction Manager...");
                final JpaTransactionManager txManager = new JpaTransactionManager();
                txManager.setEntityManagerFactory(entityManagerFactory());
                return txManager;
            }
        }

    I've tested Persistence.createEntityManagerFactory("transactions-optional") to see if the app can persist using this EMF; well, it does, so I am sure that this EMF works fine. The problem is the "wiring up" with Spring Data JPA; can anybody help?

    Read the article

  • Parse and read data frame in C?

    - by user253656
    I am writing a program that reads data from the serial port on Linux. The data is sent by another device with the following frame format:

        | Start | Command | Data           | CRC | End  |
        | 0x02  | 0x41    | (0-127 octets) |     | 0x03 |

    The Data field holds up to 127 octets as shown; octets 1-2 carry one kind of value and octets 3-4 carry another, and I need both. I know how to write and read data to and from a serial port in Linux, but only as a simple string (like "ABD"). My issue is that I do not know how to parse a data frame formatted as above, so that I can:

    - get the value in octets 1-2 of the Data field
    - get the value in octets 3-4 of the Data field
    - get the value of the CRC field to check the consistency of the data

    Here is the sample code that reads and writes a simple string from and to a serial port in Linux:

        int writeport(int fd, char *chars) {
            int len = strlen(chars);
            chars[len] = 0x0d;      // stick a <CR> after the command
            chars[len + 1] = 0x00;  // terminate the string properly
            int n = write(fd, chars, strlen(chars));
            if (n < 0) {
                fputs("write failed!\n", stderr);
                return 0;
            }
            return 1;
        }

        int readport(int fd, char *result) {
            int iIn = read(fd, result, 254);
            if (iIn < 0) {                   // check for errors before touching the buffer
                if (errno == EAGAIN) {
                    printf("SERIAL EAGAIN ERROR\n");
                    return 0;
                } else {
                    printf("SERIAL read error %d %s\n", errno, strerror(errno));
                    return 0;
                }
            }
            if (iIn > 0)
                result[iIn - 1] = 0x00;      // terminate only after a successful read
            return 1;
        }

    Does anyone have some ideas? Thanks all.

    Read the article

  • Pre-filtering and shaping OData feeds using WCF Data Services and the Entity Framework - Part 1

    - by rajbk
    The Open Data Protocol, referred to as OData, is a new data-sharing standard that breaks down silos and fosters an interoperable ecosystem for data consumers (clients) and producers (services) that is far more powerful than currently possible. It enables more applications to make sense of a broader set of data, and helps every data service and client add value to the whole ecosystem. WCF Data Services (previously known as ADO.NET Data Services) was the first Microsoft technology to support the Open Data Protocol, in Visual Studio 2008 SP1. It provides developers with client libraries for .NET, Silverlight, AJAX, PHP and Java. Microsoft now also supports OData in SQL Server 2008 R2, Windows Azure Storage, Excel 2010 (through PowerPivot), and SharePoint 2010. Many other applications are in the works.*

    This post walks you through how to create an OData feed, define a shape for the data and pre-filter the data using Visual Studio 2010, WCF Data Services and the Entity Framework. A sample project is attached at the bottom of Part 2 of this post: Pre-filtering and shaping OData feeds using WCF Data Services and the Entity Framework - Part 2.

    Create the Web Application: File –› New –› Project, select "ASP.NET Empty Web Application".

    Add the Entity Data Model: Right click on the Web Application in the Solution Explorer and select "Add New Item..". Select "ADO.NET Entity Data Model" under "Data". Name the Model "Northwind" and click "Add". In "Choose Model Contents", select "Generate Model From Database" and click "Next". Define a connection to your database containing the Northwind database in the next screen. We are going to expose the Products table through our OData feed, so select "Products" in the "Choose your Database Objects" screen. Click "Finish". We are done creating our Entity Data Model; save the Northwind.edmx file created.

    Add the WCF Data Service: Right click on the Web Application in the Solution Explorer and select "Add New Item..". Select "WCF Data Service" from the list and call the service "DataService" (creative, huh?). Click "Add".

    Enable Access to the Data Service: Open the DataService.svc.cs class. The class is well commented and instructs us on the next steps.

        public class DataService : DataService< /* TODO: put your data source class name here */ >
        {
            // This method is called only once to initialize service-wide policies.
            public static void InitializeService(DataServiceConfiguration config)
            {
                // TODO: set rules to indicate which entity sets and service operations
                // are visible, updatable, etc. Examples:
                // config.SetEntitySetAccessRule("MyEntityset", EntitySetRights.AllRead);
                // config.SetServiceOperationAccessRule("MyServiceOperation", ServiceOperationRights.All);
                config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
            }
        }

    Replace the comment that starts with "/* TODO:" with "NorthwindEntities" (the entity container name of the Model we created earlier). WCF Data Services is initially locked down by default, FTW! No data is exposed without you explicitly setting it. You have to explicitly specify which entity sets you wish to expose, and what rights are allowed, by using SetEntitySetAccessRule. SetServiceOperationAccessRule, on the other hand, sets rules for a specified operation. Let us define an access rule to expose the Products entity we created earlier. We use EntitySetRights.AllRead since we want to give read-only access. Our modified code is shown below.
        public class DataService : DataService<NorthwindEntities>
        {
            public static void InitializeService(DataServiceConfiguration config)
            {
                config.SetEntitySetAccessRule("Products", EntitySetRights.AllRead);
                config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
            }
        }

    We are done setting up our OData feed! Compile your project, then right click on DataService.svc and select "View in Browser" to see the OData feed. To view the feed in IE, you must make sure that "Feed Reading View" is turned off; you set this under Tools -› Internet Options -› Content tab. If you navigate to "Products", you should see the Products feed. Note also that URIs are case sensitive, i.e. Products works but products doesn't.

    Filtering our data: OData has a set of system query operations you can use to perform common operations against data exposed by the model. For example, to see only Products in CategoryID 2, we can use the following request:

        /DataService.svc/Products?$filter=CategoryID eq 2

    At the time of this writing, the supported operations are $orderby, $top, $skip, $filter, $expand, $format†, $select and $inlinecount.

    Pre-filtering our data using Query Interceptors: The Products feed currently returns all Products. We want to change that so that it contains only Products that have not been discontinued. WCF introduces the concept of interceptors, which allow us to inject custom validation/policy logic into the request/response pipeline of a WCF data service. We will use a QueryInterceptor to pre-filter the data so that it returns only Products that are not discontinued. To create a QueryInterceptor, write a method that returns an Expression<Func<T, bool>> and mark it with the QueryInterceptor attribute as shown below.

        [QueryInterceptor("Products")]
        public Expression<Func<Product, bool>> OnReadProducts()
        {
            return o => o.Discontinued == false;
        }

    Viewing the feed after compilation will only show products that have not been discontinued. We can also confirm this by looking at the WHERE clause in the SQL generated by the Entity Framework:

        SELECT [Extent1].[ProductID] AS [ProductID],
        ...
        [Extent1].[Discontinued] AS [Discontinued]
        FROM [dbo].[Products] AS [Extent1]
        WHERE 0 = [Extent1].[Discontinued]

    Other examples of Query/Change interceptors can be seen here, including an example to filter data based on the identity of the authenticated user. We are done pre-filtering our data. In the next part of this post, we will see how to shape our data: Pre-filtering and shaping OData feeds using WCF Data Services and the Entity Framework - Part 2.

    Foot Notes
    * http://msdn.microsoft.com/en-us/data/aa937697.aspx
    † $format did not work for me. The way to get a JSON response is to include the following in the request header when making the request: "Accept: application/json, text/javascript, */*". This is easily done with most JavaScript libraries.
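
    For instance, a hedged jQuery sketch of that JSON workaround against the service built above (the headers option assumes jQuery 1.5+, and the payload shape assumes OData v2's verbose JSON):

        $.ajax({
            url: "/DataService.svc/Products?$filter=" + encodeURIComponent("CategoryID eq 2"),
            headers: { Accept: "application/json, text/javascript, */*" },
            success: function (data) {
                console.log(data.d); // OData v2 wraps the result set in a "d" property
            }
        });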

    Read the article

  • How to speed up this simple mysql query?

    - by Jim Thio
    The query is simple:

        SELECT TB.ID, TB.Latitude, TB.Longitude,
               111151.29341326 * SQRT(POW(-6.185 - TB.Latitude, 2)
                   + POW(106.773 - TB.Longitude, 2)
                   * COS(-6.185 * 0.017453292519943)
                   * COS(TB.Latitude * 0.017453292519943)) AS Distance
        FROM `tablebusiness` AS TB
        WHERE -6.2767668133836 < TB.Latitude AND TB.Latitude < -6.0932331866164
          AND FoursquarePeopleCount > 5
          AND 106.68123318662 < TB.Longitude AND TB.Longitude < 106.86476681338
        ORDER BY Distance

    See, we just look at all businesses within a rectangle. The table has 1.6 million rows; within that small rectangle there are only 67,565 businesses. The structure of the table is:

        ID                 varchar(250)  utf8_unicode_ci  NOT NULL
        Email              varchar(400)  utf8_unicode_ci  NULL
        InBuildingAddress  varchar(400)  utf8_unicode_ci  NULL
        Price              int(10)                        NULL
        Street             varchar(400)  utf8_unicode_ci  NULL
        Title              varchar(400)  utf8_unicode_ci  NULL
        Website            varchar(400)  utf8_unicode_ci  NULL
        Zip                varchar(400)  utf8_unicode_ci  NULL
        Rating Star        double                         NULL
        Rating Weight      double                         NULL
        Latitude           double                         NULL
        Longitude          double                         NULL
        Building           varchar(200)  utf8_unicode_ci  NULL
        City               varchar(100)  utf8_unicode_ci  NOT NULL
        OpeningHour        varchar(400)  utf8_unicode_ci  NULL
        TimeStamp          timestamp     NOT NULL  DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
        CountViews         int(11)                        NULL

    The indexes (all BTREE) are:

        PRIMARY (unique)   ID                      cardinality 1965990
        City               City                    cardinality 131066
        Building           Building                cardinality 21
        OpeningHour        OpeningHour(255)        cardinality 21
        Email              Email(255)              cardinality 21
        InBuildingAddress  InBuildingAddress(255)  cardinality 21
        Price              Price                   cardinality 21
        Street             Street(255)             cardinality 982995
        Title              Title(255)              cardinality 1965990
        Website            Website(255)            cardinality 491497
        Zip                Zip(255)                cardinality 178726
        Rating Star        Rating Star             cardinality 21
        Rating Weight      Rating Weight           cardinality 21
        Latitude           Latitude                cardinality 1965990
        Longitude          Longitude               cardinality 1965990

    The query took forever. I think there has to be something wrong there.

        Showing rows 0 - 29 (67,565 total, Query took 12.4767 sec)

    Read the article

  • Big Data Matters with ODI12c

    - by Madhu Nair
    contributed by Mike Eisterer

    On October 17th, 2013, Oracle announced the release of Oracle Data Integrator 12c (ODI12c). This release signifies improvements to Oracle's Data Integration portfolio of solutions, particularly Big Data integration.

    Why Big Data = Big Business

    Organizations are gaining greater insights and actionability through the increased storage, processing and analytical benefits offered by Big Data solutions. New technologies and frameworks like HDFS, NoSQL, Hive and MapReduce support these benefits now. As more data is collected, analytical requirements increase, the complexity of managing transformations and aggregations of data compounds, and organizations need scalable Data Integration solutions. ODI12c provides enterprise solutions for the movement, translation and transformation of information and data, heterogeneously and in Big Data environments, through:

    - The ability for existing ODI and SQL developers to leverage new Big Data technologies.
    - A metadata-focused approach for cataloging, defining and reusing Big Data technologies, mappings and process executions.
    - Integration between many heterogeneous environments and technologies such as HDFS and Hive.
    - Generation of Hive Query Language.

    Working with Big Data using Knowledge Modules

    ODI12c provides developers with the ability to define sources and targets and visually develop mappings to effect the movement and transformation of data. As the mappings are created, ODI12c leverages a rich library of prebuilt integrations, known as Knowledge Modules (KMs). These KMs are contextual to the technologies and platforms to be integrated; the steps and actions needed to manage the data integration are pre-built and configured within the KMs. The Oracle Data Integrator Application Adapter for Hadoop provides a series of KMs specifically designed to integrate with Big Data technologies. The Big Data KMs include:

    - Check Knowledge Module
    - Reverse Engineer Knowledge Module
    - Hive Transform Knowledge Module
    - Hive Control Append Knowledge Module
    - File to Hive (LOAD DATA) Knowledge Module
    - File-Hive to Oracle (OLH-OSCH) Knowledge Module

    Nothing beats an example: To demonstrate the use of the KMs which are part of the ODI Application Adapter for Hadoop, a mapping may be defined to move data between files and Hive targets. The mapping is defined by dragging the source and target into the mapping, performing the attribute (column) mapping (see Figure 1) and then selecting the KM which will govern the process. In this mapping example, movie data is being moved from an HDFS source into a Hive table. Some of the attributes, such as "CUSTID to custid", have been mapped over.

    Figure 1  Defining the Mapping

    Before the proper KM can be assigned to define the technology for the mapping, it needs to be added to the ODI project. The Big Data KMs have been made available to the project through the KM import process, generally done prior to defining the mapping.

    Figure 2  Importing the Big Data Knowledge Modules

    Following the import, the KMs are available in the Designer Navigator.
    Figure 3  The Project View in Designer, Showing Installed IKMs

    Once the KM is imported, it may be assigned to the mapping target. This is done by selecting the Physical View of the mapping and examining the Properties of the Target; in this case MOVIAPP_LOG_STAGE is the target of our mapping.

    Figure 4  Physical View of the Mapping and Assigning the Big Data Knowledge Module to the Target

    Alternative KMs may be selected as well, providing flexibility and abstracting the logical mapping from the physical implementation; our mapping may be applied to other technologies as well. The mapping is now complete and is ready to run. We will see more in a future blog about running a mapping to load Hive. To complete this quick ODI for Big Data overview, let us take a closer look at what the IKM File to Hive is doing for us. ODI provides differentiated capabilities by defining, within the KM, the process and steps which normally would have to be manually developed, tested and implemented. As shown in Figure 5, the KM is preparing the Hive session, managing the Hive tables, performing the initial load from HDFS and then performing the insert into Hive. HDFS and Hive options are selected graphically, as shown in the properties in Figure 4.

    Figure 5  Process and Steps Managed by the KM

    What's Next

    Big Data, being the shape-shifting business challenge that it is, is fast evolving into the deciding factor between market leaders and others. Now that an introduction to ODI and Big Data has been provided, look for additional blogs coming soon using the Knowledge Modules which make up the Oracle Data Integrator Application Adapter for Hadoop:

    - Importing Big Data Metadata into ODI, Testing Data Stores and Loading Hive Targets
    - Generating Transformations using Hive Query Language
    - Loading Oracle from Hadoop Sources

    For more information now, please visit the Oracle Data Integrator Application Adapter for Hadoop web site, http://www.oracle.com/us/products/middleware/data-integration/hadoop/overview/index.html. Do not forget to tune in to the ODI12c Executive Launch webcast on the 12th to hear more about ODI12c and GG12c.

    Read the article

  • Opera 12 has arrived: HTML5 video capture and graphics acceleration for the entire browser

    Opera 12 has arrived: HTML5 video capture and graphics acceleration for the entire browser. [Edit to the section on WebGL.] Opera 12 is out. We will pass quickly over the ability to customize the browser more easily, and instead highlight an experimental use of hardware acceleration for rendering web pages (and not just animations), and even for drawing its UI. Both make browsing and general operation much faster. This new feature must be enabled by setting the opera:config#UserPrefs|EnableHardwareAcceleration parameter to 1. [IMG]http://ftp-developpez.c...

    Read the article

  • Set and Verify the Retention Value for Change Data Capture

    - by AllenMWhite
    Last summer I set up Change Data Capture for a client to track changes to their application database to apply those changes to their data warehouse. The client had some issues a short while back and felt they needed to increase the retention period from the default 3 days to 5 days. I ran this query to make that change:

        sp_cdc_change_job @job_type='cleanup', @retention=7200

    The value 7200 represents the number of minutes in a period of 5 days. All was well, but they recently asked how they can verify...(read more)

    Read the article

  • lead capture apps

    - by Steve
    I'd like to build a lead capture website, where a visitor is required to enter their name and email address in order to receive a free report for download. Do you know of an app that will let me store the captured information in a database and present the download link after registering? I guess this can all be accomplished with a contact form script with database integration and a jQuery confirmation message.
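
    A hedged sketch of the jQuery side of that flow (the #leadForm, #confirmation and #downloadLink ids and the /capture.php endpoint are all hypothetical): submit the form over AJAX, let the server store the lead, and reveal the link on success.

        $("#leadForm").submit(function (e) {
            e.preventDefault(); // keep the page from reloading
            $.post("/capture.php", $(this).serialize(), function () {
                $("#confirmation").text("Thanks! Your report is ready."); // confirmation message
                $("#downloadLink").show();                                // reveal the download link
            });
        });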

    Read the article

  • Change Data Capture - SQL Server 2008

    Change Data Capture (CDC) records DML operations performed on SQL tables and makes records available with information regarding what changed and when the change happened in a simple way.

    Read the article

  • JavaScript Data Binding Frameworks

    - by dwahlin
    Data binding is where it's at nowadays when it comes to building client-centric Web applications. Developers experienced with desktop frameworks like WPF or web frameworks like ASP.NET, Silverlight, or others are used to being able to take model objects containing data and bind them to UI controls quickly and easily. When moving to client-side Web development, the data binding story hasn't been great, since neither HTML nor JavaScript natively supports data binding. This means that you have to write code to place data in a control and write code to extract it. Although it's certainly feasible to do it from scratch (many of us have done it this way for years), it's definitely tedious and not exactly the best solution when it comes to maintenance and re-use. Over the last few years several different script libraries have been released to simplify the process of binding data to HTML controls. In fact, the subject of data binding is becoming so popular that it seems like a new script library is being released nearly every week. Many of the libraries provide MVC/MVVM pattern support in client-side JavaScript apps and some even integrate directly with server frameworks like Node.js. Here's a quick list of a few of the available libraries that support data binding (if you like any others please add a comment and I'll try to keep the list updated):

    - AngularJS: MVC framework for data binding (although it closely follows the MVVM pattern).
    - Backbone.js: MVC framework with support for models, key/value binding, custom events, and more.
    - Derby: provides a real-time environment that runs in the browser and in Node.js. The library supports data binding and templates.
    - Ember: provides support for templates that automatically update as data changes.
    - JsViews: data binding framework that provides "interactive data-driven views built on top of JsRender templates".
    - jQXB Expression Binder: lightweight jQuery plugin that supports bi-directional data binding.
    - KnockoutJS: MVVM framework with robust support for data binding. For an excellent look at using KnockoutJS, check out John Papa's course on Pluralsight.
    - Meteor: end-to-end framework that uses Node.js on the server and provides support for data binding on the client.
    - Simpli5: JavaScript framework that provides support for two-way data binding.
    - WinRT with HTML5/JavaScript: if you're building Windows 8 applications using HTML5 and JavaScript, there's built-in support for data binding in the WinJS library.

    I won't have time to write about each of these frameworks, but in the next post I'm going to talk about my (current) favorite when it comes to client-side JavaScript data binding libraries, which is AngularJS. AngularJS provides an extremely clean way, in my opinion, to extend HTML syntax to support data binding while keeping model objects (the objects that hold the data) free from custom framework method calls or other weirdness. While I'm writing up the next post, feel free to visit the AngularJS developer guide if you'd like additional details about the API and want to get started using it.
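
    To make the tedium concrete, here is the hand-rolled wiring these libraries automate, a bare-bones sketch with a hypothetical model object and input element: one function pushes model data into the control, one handler pulls edits back out.

        var model = { name: "Ada" };                      // hypothetical model object
        var input = document.getElementById("nameInput"); // hypothetical input element

        function render() {            // model -> view, re-run after every model change
            input.value = model.name;
        }
        input.onchange = function () { // view -> model, runs as the user edits
            model.name = input.value;
        };
        render();

    Every bound field means another render/extract pair like this by hand; the frameworks above generate or hide that plumbing.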

    Read the article

  • Protect Data and Save Money? Learn How Best-in-Class Organizations do Both

    - by roxana.bradescu
    Databases contain nearly two-thirds of the sensitive information that must be protected as part of any organization's overall approach to security, risk management, and compliance. Solutions for protecting data housed in databases vary from encrypting data at the application level to defense-in-depth protection of the database itself. So is there a difference? Absolutely! According to new research from the Aberdeen Group, Best-in-Class organizations experience fewer data breaches and audit deficiencies - at lower cost -- by deploying database security solutions. And the results are dramatic: Aberdeen found that organizations encrypting data within their databases achieved 30% fewer data breaches and 15% greater audit efficiency with 34% less total cost when compared to organizations encrypting data within applications. Join us for a live webcast with Derek Brink, Vice President and Research Fellow at the Aberdeen Group, next week to learn how your organization can become Best-in-Class.

    Read the article

  • Bad Data is Really the Monster

    - by Dain C. Hansen
    "Bad Data is Really the Monster" is an article written by Bikram Sinha, from whom I borrowed the title and the inspiration for this blog. Sinha writes: "Bad or missing data makes application systems fail when they process order-level data. One of the key items in the supply-chain industry is the product (aka SKU). Therefore, it becomes the most important data element to tie up multiple merchandising processes including purchase order allocation, stock movement, shipping notifications, and inventory details... Bad data can cause huge operational failures and cost millions of dollars in terms of time, resources, and money to clean up and validate data across multiple participating systems." Yes, bad data really is the monster, so what do we do about it? Close our eyes and hope it stays in the closet? We've tackled this problem for some years now at Oracle, and our latest introduction of Oracle Enterprise Data Quality, along with our integrated Oracle Master Data Management products, provides a complete, best-in-class answer to the bad data monster. What's unique about it? Oracle Enterprise Data Quality combines powerful data profiling, cleansing, matching, and monitoring capabilities while offering unparalleled ease of use. What makes it unique is that it has dedicated capabilities to address the distinct challenges of both customer and product data quality [different monsters have different needs, of course!]. The ability to profile data is just as important, to identify and measure poor quality data and identify new rules and requirements. Included are semantic and pattern-based recognition to accurately parse and standardize data that is poorly structured. Finally, all of the data quality components are integrated with Oracle Master Data Management, including Oracle Customer Hub and Oracle Product Hub, as well as Oracle Data Integrator Enterprise Edition and Oracle CRM. Want to learn more? On Tuesday, Nov 15th, I invite you to listen to our webcast, "Reduce ERP Consolidation Risks with Oracle Master Data Management". I'll be joined by our partner iGate Patni, and we will be talking about one specific way to deal with the bad data monster, specifically around ERP consolidation. Look forward to seeing you there!

    Read the article

  • Where can I locate business data to use in my application?

    - by Aaron McIver
    This question talks about any and all free public raw data; it appeared to have valuable pieces, but nothing that really provides what I am looking for. Instead of using a socially defined listing of businesses (foursquare), I would like a data set of registered businesses and their associated addresses that could then be searched by location (coordinates). The critical need is that the data set should be filterable on varying criteria (give me all restaurants, coffee shops, etc.). If the data is free, that is great, but anywhere that sells this type of data would also suffice. Infochimps looked like a possibility, but perhaps something a bit more extensive exists. Where can I find a free or for-fee data set of registered businesses that is filterable by type of business and location?

    Read the article

  • How to change size of Android ScrollView

    - by user611923
    I have a layout like this (abstracted):

        ScrollView (fill_parent)
            LinearLayout (wrap_content)
                ImageView 1 (wrap_content)
                ImageView 2
                ImageView 3
                ...

    In the course of user interaction some images are replaced by larger or smaller ones, shortening or lengthening the area to be scrolled. And that is the problem: the ScrollView (SV) does not know about the change, so it either scrolls over a lot of empty space at the bottom, or cuts off a picture or two at the bottom. Question 1: Can I somehow make the SV re-adapt to the changed height of the LinearLayout (LL)? Question 2: I can obtain the current size through getHeight on the LL, but supplying it to the SV via changed LayoutParams does not work; of course, that would only change the height of the SV on the screen. Is there a way to push the changed height of the LL into the SV in code, somehow? Question 3: I haven't tried it yet; would creating the SV, LL and its children in code, and then adding/removing children of changed size as required, make the SV adapt to the changes? And last question: Is there a better approach, except using ListViews?

    Read the article

  • how to use javascript to change div backgroundColor

    - by lanqy
    The HTML is below:

        <div id="catestory">
            <div class="content">
                <h2>some title here</h2>
                <p>some content here</p>
            </div>
            <div class="content">
                <h2>some title here</h2>
                <p>some content here</p>
            </div>
            <div class="content">
                <h2>some title here</h2>
                <p>some content here</p>
            </div>
        </div>

    When the mouse is over a content div, that div's background color and the background color of the h2 inside it should change (just like the CSS :hover). I know CSS (:hover) can do this in modern browsers, but it doesn't work in IE6. How can I do this with plain JavaScript (not jQuery or another JS framework)? Edit: how do I change the h2 background color too?
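
    A minimal plain-JavaScript sketch (the colors are placeholders): attach mouseover/mouseout handlers to each content div and recolor both the div and its first h2, since IE6 only supports :hover on links.

        var divs = document.getElementById("catestory").getElementsByTagName("div");
        for (var i = 0; i < divs.length; i++) {
            if (divs[i].className !== "content") continue;
            divs[i].onmouseover = function () {
                this.style.backgroundColor = "#ffcc00";                         // placeholder color
                this.getElementsByTagName("h2")[0].style.backgroundColor = "#ffcc00";
            };
            divs[i].onmouseout = function () {
                this.style.backgroundColor = "";                                // restore stylesheet value
                this.getElementsByTagName("h2")[0].style.backgroundColor = "";
            };
        }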

    Read the article

  • PHP driven site needs password change.

    - by Drea
    I have inherited a website that needs the password that accesses the database changed. I can see that there are two tables within the database, but neither of them has username or password info. The previous web guy moved out of the country and can't be reached. I am not up-to-speed enough to figure this out; I have gone through all the files to try to find the answer but can't get it. It's hosted by GoDaddy.com, and I have changed the passwords there, but that didn't change this login info. www.executivehomerents.com/cpanel <- this brings up the prompt for the username and password (which I won't give out), but the page only gives you five choices, and none of them deal with changing the password; they are simply to change the data in the tables. If you go to http://www.websitedatabases.com/ <= this is the company that the PHPMagic program was purchased from; they have no contact number. Here is another page that might help: http://www.executivehomerents.com/cpanel/dbinput/setup.php. I don't think I'll get an answer to this, but it's worth a try... thanks.

    Read the article

  • Can't connect to smtp (postfix, dovecot) after making a change and trying to change it back

    - by UberBrainChild
    I am using postfix and dovecot along with zpanel, and I tried enabling SSL and then turned it off, as I did not have SSL configured yet and realized it was a bit stupid at the time. I am using CentOS 6.4. I get the following error in the mail log (I changed my host name to "myhostname" and my domain to "mydomain.com"):

        Oct 20 01:49:06 myhostname postfix/smtpd[4714]: connect from mydomain.com[127.0.0.1]
        Oct 20 01:49:16 myhostname postfix/smtpd[4714]: fatal: no SASL authentication mechanisms
        Oct 20 01:49:17 myhostname postfix/master[4708]: warning: process /usr/libexec/postfix/smtpd pid 4714 exit status 1
        Oct 20 01:49:17 myhostname postfix/master[4708]: warning: /usr/libexec/postfix/smtpd: bad command startup -- throttling

    Reading forums and similar questions, I figured it was just a service that was not running or installed. However, I can see that saslauthd is currently up and running on my system, and restarting it does not help. Here is my postfix master.cf:

        #
        # Postfix master process configuration file. For details on the format
        # of the file, see the Postfix master(5) manual page.
        #
        # ***** Unused items removed *****
        #
        # ==========================================================================
        # service type  private unpriv  chroot  wakeup  maxproc command + args
        #               (yes)   (yes)   (yes)   (never) (100)
        # ==========================================================================
        smtp       inet  n  -  n  -     -  smtpd
        #  -o content_filter=smtp-amavis:127.0.0.1:10024
        #  -o receive_override_options=no_address_mappings
        pickup     fifo  n  -  n  60    1  pickup
        submission inet  n  -  -  -     -  smtpd
          -o content_filter= -o receive_override_options=no_header_body_checks
        cleanup    unix  n  -  n  -     0  cleanup
        qmgr       fifo  n  -  n  300   1  qmgr
        #qmgr      fifo  n  -  n  300   1  oqmgr
        tlsmgr     unix  -  -  n  1000? 1  tlsmgr
        rewrite    unix  -  -  n  -     -  trivial-rewrite
        bounce     unix  -  -  n  -     0  bounce
        defer      unix  -  -  n  -     0  bounce
        trace      unix  -  -  n  -     0  bounce
        verify     unix  -  -  n  -     1  verify
        flush      unix  n  -  n  1000? 0  flush
        proxymap   unix  -  -  n  -     -  proxymap
        smtp       unix  -  -  n  -     -  smtp
        smtps      inet  n  -  -  -     -  smtpd
        # When relaying mail as backup MX, disable fallback_relay to avoid MX loops
        relay      unix  -  -  n  -     -  smtp -o fallback_relay=
        #  -o smtp_helo_timeout=5 -o smtp_connect_timeout=5
        showq      unix  n  -  n  -     -  showq
        error      unix  -  -  n  -     -  error
        discard    unix  -  -  n  -     -  discard
        local      unix  -  n  n  -     -  local
        virtual    unix  -  n  n  -     -  virtual
        lmtp       unix  -  -  n  -     -  lmtp
        anvil      unix  -  -  n  -     1  anvil
        scache     unix  -  -  n  -     1  scache
        #
        # ====================================================================
        # Interfaces to non-Postfix software. Be sure to examine the manual
        # pages of the non-Postfix software to find out what options it wants.
        # ====================================================================
        maildrop   unix  -  n  n  -     -  pipe
          flags=DRhu user=vmail argv=/usr/local/bin/maildrop -d ${recipient}
        uucp       unix  -  n  n  -     -  pipe
          flags=Fqhu user=uucp argv=uux -r -n -z -a$sender - $nexthop!rmail ($recipient)
        ifmail     unix  -  n  n  -     -  pipe
          flags=F user=ftn argv=/usr/lib/ifmail/ifmail -r $nexthop ($recipient)
        bsmtp      unix  -  n  n  -     -  pipe
          flags=Fq. user=foo argv=/usr/local/sbin/bsmtp -f $sender $nexthop $recipient
        #
        # spam/virus section
        #
        smtp-amavis unix -  -  y  -     2  smtp
          -o smtp_data_done_timeout=1200 -o disable_dns_lookups=yes
          -o smtp_send_xforward_command=yes
        127.0.0.1:10025 inet n - y -    -  smtpd
          -o content_filter= -o smtpd_helo_restrictions=
          -o smtpd_sender_restrictions=
          -o smtpd_recipient_restrictions=permit_mynetworks,reject
          -o mynetworks=127.0.0.0/8 -o smtpd_error_sleep_time=0
          -o smtpd_soft_error_limit=1001 -o smtpd_hard_error_limit=1000
          -o receive_override_options=no_header_body_checks
          -o smtpd_bind_address=127.0.0.1 -o smtpd_helo_required=no
          -o smtpd_client_restrictions= -o smtpd_restriction_classes=
          -o disable_vrfy_command=no -o strict_rfc821_envelopes=yes
        #
        # Dovecot LDA
        dovecot    unix  -  n  n  -     -  pipe
          flags=DRhu user=vmail:mail argv=/usr/libexec/dovecot/deliver -d ${recipient}
        #
        # Vacation mail
        vacation   unix  -  n  n  -     -  pipe
          flags=Rq user=vacation argv=/var/spool/vacation/vacation.pl -f ${sender} -- ${recipient}

    And here is dovecot:

        ##
        ## Dovecot config file
        ##
        listen = *
        disable_plaintext_auth = no
        protocols = imap pop3 lmtp sieve
        auth_mechanisms = plain login
        passdb {
          driver = sql
          args = /etc/zpanel/configs/dovecot2/dovecot-mysql.conf
        }
        userdb {
          driver = sql
        }
        userdb {
          driver = sql
          args = /etc/zpanel/configs/dovecot2/dovecot-mysql.conf
        }
        mail_location = maildir:/var/zpanel/vmail/%d/%n
        first_valid_uid = 101
        #last_valid_uid = 0
        first_valid_gid = 12
        #last_valid_gid = 0
        #mail_plugins =
        mailbox_idle_check_interval = 30 secs
        maildir_copy_with_hardlinks = yes
        service imap-login {
          inet_listener imap {
            port = 143
          }
        }
        service pop3-login {
          inet_listener pop3 {
            port = 110
          }
        }
        service lmtp {
          unix_listener lmtp {
            #mode = 0666
          }
        }
        service imap {
          vsz_limit = 256M
        }
        service pop3 {
        }
        service auth {
          unix_listener auth-userdb {
            mode = 0666
            user = vmail
            group = mail
          }
          # Postfix smtp-auth
          unix_listener /var/spool/postfix/private/auth {
            mode = 0666
            user = postfix
            group = postfix
          }
        }
        service auth-worker {
        }
        service dict {
          unix_listener dict {
            mode = 0666
            user = vmail
            group = mail
          }
        }
        service managesieve-login {
          inet_listener sieve {
            port = 4190
          }
          service_count = 1
          process_min_avail = 0
          vsz_limit = 64M
        }
        service managesieve {
        }
        lda_mailbox_autocreate = yes
        lda_mailbox_autosubscribe = yes
        protocol lda {
          mail_plugins = quota sieve
          postmaster_address = [email protected]
        }
        protocol imap {
          mail_plugins = quota imap_quota trash
          imap_client_workarounds = delay-newmail
        }
        lmtp_save_to_detail_mailbox = yes
        protocol lmtp {
          mail_plugins = quota sieve
        }
        protocol pop3 {
          mail_plugins = quota
          pop3_client_workarounds = outlook-no-nuls oe-ns-eoh
        }
        protocol sieve {
          managesieve_max_line_length = 65536
          managesieve_implementation_string = Dovecot Pigeonhole
          managesieve_max_compile_errors = 5
        }
        dict {
          quotadict = mysql:/etc/zpanel/configs/dovecot2/dovecot-dict-quota.conf
        }
        plugin {
          # quota = dict:User quota::proxy::quotadict
          quota = maildir:User quota
          acl = vfile:/etc/dovecot/acls
          trash = /etc/zpanel/configs/dovecot2/dovecot-trash.conf
          sieve_global_path = /var/zpanel/sieve/globalfilter.sieve
          sieve = ~/dovecot.sieve
          sieve_dir = ~/sieve
          sieve_global_dir = /var/zpanel/sieve/
          #sieve_extensions = +notify +imapflags
          sieve_max_script_size = 1M
          #sieve_max_actions = 32
          #sieve_max_redirects = 4
        }
        log_path = /var/log/dovecot.log
        info_log_path = /var/log/dovecot-info.log
        debug_log_path = /var/log/dovecot-debug.log
        mail_debug = yes
        ssl = no

    Does anyone have any ideas or tips on what I can try to get this working? Thanks for all the help.

    EDIT: Output of postconf -n:

        alias_database = hash:/etc/aliases
        alias_maps = hash:/etc/aliases
        broken_sasl_auth_clients = yes
        command_directory = /usr/sbin
        config_directory = /etc/postfix
        daemon_directory = /usr/libexec/postfix
        debug_peer_level = 2
        delay_warning_time = 4
        disable_vrfy_command = yes
        html_directory = no
        inet_interfaces = all
        mail_owner = postfix
        mailq_path = /usr/bin/mailq.postfix
        manpage_directory = /usr/share/man
        mydestination = localhost.$mydomain, localhost
        mydomain = control.yourdomain.com
        myhostname = control.yourdomain.com
        mynetworks = all
        newaliases_path = /usr/bin/newaliases.postfix
        queue_directory = /var/spool/postfix
        readme_directory = /usr/share/doc/postfix-2.2.2/README_FILES
        recipient_delimiter = +
        relay_domains = proxy:mysql:/etc/zpanel/configs/postfix/mysql-relay_domains_maps.cf
        sample_directory = /usr/share/doc/postfix-2.2.2/samples
        sendmail_path = /usr/sbin/sendmail.postfix
        setgid_group = postdrop
        smtp_use_tls = no
        smtpd_client_restrictions =
        smtpd_data_restrictions = reject_unauth_pipelining
        smtpd_helo_required = yes
        smtpd_helo_restrictions =
        smtpd_recipient_restrictions = permit_sasl_authenticated, permit_mynetworks, reject_unauth_destination, reject_non_fqdn_sender, reject_non_fqdn_recipient, reject_unknown_recipient_domain
        smtpd_sasl_auth_enable = yes
        smtpd_sasl_local_domain = $myhostname
        smtpd_sasl_path = private/auth
        smtpd_sasl_security_options = noanonymous
        smtpd_sasl_type = dovecot
        smtpd_sender_restrictions =
        smtpd_use_tls = no
        soft_bounce = yes
        transport_maps = hash:/etc/postfix/transport
        unknown_local_recipient_reject_code = 550
        virtual_alias_maps = proxy:mysql:/etc/zpanel/configs/postfix/mysql-virtual_alias_maps.cf, regexp:/etc/zpanel/configs/postfix/virtual_regexp
        virtual_gid_maps = static:12
        virtual_mailbox_base = /var/zpanel/vmail
        virtual_mailbox_domains = proxy:mysql:/etc/zpanel/configs/postfix/mysql-virtual_domains_maps.cf
        virtual_mailbox_maps = proxy:mysql:/etc/zpanel/configs/postfix/mysql-virtual_mailbox_maps.cf
        virtual_minimum_uid = 101
        virtual_transport = dovecot
        virtual_uid_maps = static:101

    Read the article

  • Linux, how to capture screen, and simulate mouse movements.

    - by 2di
    Hi all, I need to capture the screen (like Print Screen) in a way that lets me access pixel color data, so I can do some image recognition. After that I will need to generate mouse events on the screen, such as left click and drag and drop (moving the mouse while a button is pressed, then releasing it). Once it's done, the image will be deleted. Note: I need to capture the whole screen, everything that the user can see, and I need to simulate clicks outside the window of my program (if it makes any difference). Spec: Ubuntu Linux. Language: C++. Performance is not very important; the "print screen" function will be executed once every ~10 seconds. The process can run for up to 24 hours, so the method needs to be stable and free of memory leaks (as usual :). I was able to do this on Windows with the Windows GDI and some Windows events, but I've no idea how to do it on Linux. Thanks a lot.

    Read the article

  • Software for capturing still image snapshots from video input

    - by cantabilesoftware
    What software is available that will let me watch video from a USB video capture device at full PAL/NTSC resolution and capture a still image at a button press? More info: I'm using a KaiserBaas USB capture device (http://www.kaiserbaas.com/kaiser-baas-product-page/converters/video-to-dvd-maker), and want to capture screen shots from a PS2 connected to it. I'd like to play the game at the full PS2 resolution and capture certain scenes. The device has a one-touch snapshot button on it, but I can't seem to make it work and there's no mention of it anywhere in the documentation. The device works and the video appears in Power Producer (the included software), but I think it previews it smaller than the full resolution. I could print screen it, but having to crop will be a pain.

    Read the article

  • Sybase PowerDesigner Change Many (Find/Replace/Convert) Data Item's Data Types

    - by Andy
    Hello, I have a relatively large Conceptual Data Model in PowerDesigner. After generating a Physical Data Model and seeing the DBMS data types, I need to update the data types (NUMBER/TEXT) of every data item. I'd like to either do a find/replace within the Conceptual Data Model or somehow map to different data types when creating the Physical Data Model. For example, change the automatic conversion of Text -> CLOB to Text -> NVARCHAR(20). Thanks!

    Read the article
