Search Results


  • SQL SERVER – 2008 – Introduction to Snapshot Database – Restore From Snapshot

    - by pinaldave
    Snapshot database is one of the most interesting concepts that I have used at some places recently. Here is a quick definition of the subject from Books Online: A database snapshot is a read-only, static view of a database (the source database). Multiple snapshots can exist on a source database and always reside on the same server instance as the database. Each database snapshot is consistent, in terms of transactions, with the source database as of the moment of the snapshot's creation. A snapshot persists until it is explicitly dropped by the database owner.

    If you do not know how a snapshot database works, here is a quick note on the subject; however, please refer to the official description in Books Online for accuracy. A snapshot database is a read-only database created from an original database called the "source database". It operates at the page level. When a snapshot database is created, it is backed by sparse files; initially it occupies little or no space in the operating system. When a data page is modified in the source database, that data page is copied to the snapshot database, which makes the sparse file grow. When an unmodified data page is read in the snapshot database, the read is actually served from the pages of the original database. In other words, the snapshot continues to present the source database as it was at the moment of the snapshot's creation.

    Let us see a simple example of a snapshot. In the following exercise we will do a few operations. Please note that this script is for demo purposes only; there are considerations of CPU, disk I/O and memory, which will be discussed in future posts.

    1. Create Snapshot
    2. Delete Data from Original DB
    3. Restore Data from Snapshot

    First, let us create the Snapshot database and observe the sparse file details.

        USE master
        GO
        -- Create Regular Database
        CREATE DATABASE RegularDB
        GO
        USE RegularDB
        GO
        -- Populate Regular Database with Sample Table
        CREATE TABLE FirstTable (ID INT, Value VARCHAR(10))
        INSERT INTO FirstTable VALUES(1, 'First');
        INSERT INTO FirstTable VALUES(2, 'Second');
        INSERT INTO FirstTable VALUES(3, 'Third');
        GO
        -- Create Snapshot Database
        CREATE DATABASE SnapshotDB ON
        (Name ='RegularDB', FileName='c:\SSDB.ss1')
        AS SNAPSHOT OF RegularDB;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO

    Now let us see the resultset for the same. Next, let us delete something from the original DB and check the same details we checked before.

        -- Delete from Regular Database
        DELETE FROM RegularDB.dbo.FirstTable;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO

    When we check the details of the sparse file created by the Snapshot database, we find some interesting details. The details of the regular DB remain the same, which clearly shows that when we delete data from the regular/source DB, the affected data pages are copied to the Snapshot database. This is the reason why the size of the snapshot DB increases.
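    One way to observe those sparse file details is to query the I/O statistics DMV for the snapshot's data file. A minimal sketch (the file_id of 1 assumes the snapshot has a single data file, as in the demo above):

        -- size_on_disk_bytes reports the actual on-disk size of the sparse file
        SELECT size_on_disk_bytes
        FROM sys.dm_io_virtual_file_stats(DB_ID('SnapshotDB'), 1);

    Running this once after creating the snapshot and again after the DELETE shows the sparse file growing as pages are copied in.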
    Now let us take this small exercise to the next level and restore our deleted data from the Snapshot DB to the original source DB.

        -- Restore Data from Snapshot Database
        USE master
        GO
        RESTORE DATABASE RegularDB
        FROM DATABASE_SNAPSHOT = 'SnapshotDB';
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Clean up
        DROP DATABASE [SnapshotDB];
        DROP DATABASE [RegularDB];
        GO

    Now let us check the results of the SELECT statements; we can see that we have successfully restored the database from the Snapshot database. This is clearly a very useful feature for any business that needs it. I would like to request readers to share more details if they are using this feature in their business, and to let me know of any other tasks it could potentially be used to achieve. The complete script of the aforementioned operations is as follows for easy reference:

        USE master
        GO
        -- Create Regular Database
        CREATE DATABASE RegularDB
        GO
        USE RegularDB
        GO
        -- Populate Regular Database with Sample Table
        CREATE TABLE FirstTable (ID INT, Value VARCHAR(10))
        INSERT INTO FirstTable VALUES(1, 'First');
        INSERT INTO FirstTable VALUES(2, 'Second');
        INSERT INTO FirstTable VALUES(3, 'Third');
        GO
        -- Create Snapshot Database
        CREATE DATABASE SnapshotDB ON
        (Name ='RegularDB', FileName='c:\SSDB.ss1')
        AS SNAPSHOT OF RegularDB;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Delete from Regular Database
        DELETE FROM RegularDB.dbo.FirstTable;
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Restore Data from Snapshot Database
        USE master
        GO
        RESTORE DATABASE RegularDB
        FROM DATABASE_SNAPSHOT = 'SnapshotDB';
        GO
        -- Select from Regular and Snapshot Database
        SELECT * FROM RegularDB.dbo.FirstTable;
        SELECT * FROM SnapshotDB.dbo.FirstTable;
        GO
        -- Clean up
        DROP DATABASE [SnapshotDB];
        DROP DATABASE [RegularDB];
        GO

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Filed under: SQL, SQL Authority, SQL Backup and Restore, SQL Data Storage, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology

    Read the article

  • Extending QuickBooks Reporting with the QuickBooks ADO.NET Data Provider

    - by dataintegration
    The ADO.NET Provider for QuickBooks comes with several reports you may request from QuickBooks by default. However, there are many more that are not readily available. The ADO.NET Provider for QuickBooks makes it easy for you to create new reports and customize existing ones. In this article, we will illustrate how to create your own report and retrieve it from the Server Explorer in Visual Studio. For this example we will show how to create an Item Profitability report.

    Creating the report script file

    Step 1: Download the sample reports available here and extract them to a folder of your choice.

    Step 2: Make a copy of the ReportGeneralSummary.rsd file and rename it to ItemProfitability.rsd. Then open the file in any text editor.

    Step 3: Open the installation directory of the ADO.NET Provider for QuickBooks. Under the \db\ folder, locate the ReportJob.rsb file. Open this file in another text editor. Note: Although we are using ReportJob.rsb for this example, other reports may be contained in other Report*.rsb files. We recommend consulting the included help file and first locating the Report stored procedure and ReportType you are looking for. Otherwise, you may open each Report*.rsb file and look under the "reporttype" input for the report you are attempting to create.

    Step 4: First, let's rename the title of ItemProfitability.rsd. Near the top of the file you will see a title and description. Change the title to match the name of the file, and change the description to anything you like. For example:

        <rsb:info title="ItemProfitability" description="Executes my custom report.">

    Just below the title there are a number of columns. The Id represents the row number. The RowType represents the type of data returned by QuickBooks. The ColumnValue* columns represent all of the column data returned by QuickBooks. In some instances, we may need to add additional ColumnValue columns.

    Step 5: To add additional ColumnValue columns, simply copy the last column, paste it directly below, and keep incrementing the number at the end of the attribute name. For example:

        <attr name="ColumnValue9"  xs:type="string" readonly="true" required="false" desc="Represents a column of data."/>
        <attr name="ColumnValue10" xs:type="string" readonly="true" required="false" desc="Represents a column of data."/>
        <attr name="ColumnValue11" xs:type="string" readonly="true" required="false" desc="Represents a column of data."/>
        <attr name="ColumnValue12" xs:type="string" readonly="true" required="false" desc="Represents a column of data."/>
        ...

    Caution: Do not rename the ColumnValue* definitions themselves. They are generalized so that we can understand each type of report returned by QuickBooks. Renaming them to something other than ColumnValue* will cause your columns to return null values.

    Step 6: Now let's update the available inputs for the table. From the ReportJob.rsb file, copy all of the input elements into ItemProfitability under the "Psuedo-Column definitions" comment, replacing the existing input elements in ItemProfitability. When you are done, it should look like this:

        <!-- Psuedo-Column definitions -->
        <input name="reporttype" description="The type of the report."
               value="ITEMESTIMATESVSACTUALS,ITEMPROFITABILITY,JOBESTIMATESVSACTUALSDETAIL,JOBESTIMATESVSACTUALSSUMMARY,JOBPROFITABILITYDETAIL,JOBPROFITABILITYSUMMARY,"
               default="ITEMESTIMATESVSACTUALS" />
        <input name="reportperiod" description="Report date range in the format (fromdate:todate); either value may be omitted for an open ended range (e.g. 2009-12-25:). Supported date format: yyyy-MM-dd." />
        <input name="reportdaterangemacro" description="Use a predefined date range."
               value="ALL,TODAY,THISWEEK,THISWEEKTODATE,THISMONTH,THISMONTHTODATE,THISQUARTER,THISQUARTERTODATE,THISYEAR,THISYEARTODATE,YESTERDAY,LASTWEEK,LASTWEEKTODATE,LASTMONTH,LASTMONTHTODATE,LASTQUARTER,LASTQUARTERTODATE,LASTYEAR,LASTYEARTODATE,NEXTWEEK,NEXTFOURWEEKS,NEXTMONTH,NEXTQUARTER,NEXTYEAR,"
               default="ALL" />
        ...

    Step 7: Now let's update the operationname attribute. This needs to match the operationname used by ReportJob. After you have copied the correct value from ReportJob.rsb, the operationname in ItemProfitability should look like this:

        <rsb:set attr="operationname" value="qbReportJob"/>

    Step 8: There is one more thing we can do to make this a true Item Profitability report: remove the reporttype input and hard-code its value. To do this, copy and paste the rsb:set used for operationname, then rename the attr and value to match the name and value you want to use. For example:

        <rsb:set attr="operationname" value="qbReportJob"/>
        <rsb:set attr="reporttype" value="ITEMPROFITABILITY"/>

    After this you can remove the input for reporttype. Now that you have your own report file, we can move on to displaying the report in the Visual Studio Server Explorer.

    Accessing the report through the Data Provider

    Step 1: Open Visual Studio. In the Server Explorer, configure a new connection with the QuickBooks Data Provider.

    Step 2: For the Location connection string property, enter the directory where the new report has been saved.

    Step 3: The new report should appear as a new view in the Server Explorer. Let's retrieve data from it.

    Step 4: You can specify any inputs in the WHERE clause.

    New Report Example Script

    To help you get started using this new QuickBooks Data Provider report, you will need to download the QuickBooks ADO.NET Data Provider and the fully functional sample script.
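    As a sketch of what such a query might look like once the ItemProfitability view appears in the Server Explorer (the view name comes from the .rsd file created above; the input name is one of those copied from ReportJob.rsb):

        SELECT * FROM ItemProfitability
        WHERE reportdaterangemacro = 'THISYEAR'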

    Read the article

  • Customer Perspectives: Oracle Data Integrator

    - by Julien Testut
    The Data Integration Product Management team will be hosting a customer panel session dedicated to Oracle Data Integrator at Oracle OpenWorld. I will have the pleasure of presenting this session with three of our customers: Paychex, Ross Stores and Turkcell. In this session, you will hear how Paychex, Ross Stores and Turkcell utilize Oracle Data Integrator to meet their IT and business needs. Our customers will share with you how they use ODI in their environments, best practices, lessons learned and the benefits of implementing Oracle Data Integrator. If you're interested in hearing more about how our customers use Oracle Data Integrator, then I recommend attending this session:

    Customer Perspectives: Oracle Data Integrator
    Wednesday, October 3rd, 1:15 PM - 2:15 PM
    Marriott Marquis – Golden Gate C3

    The Data Integration track at OpenWorld covers a variety of topics and speakers. In addition to product management for Oracle GoldenGate, Oracle Data Integrator, and Enterprise Data Quality presenting product updates and roadmaps, we have several customer panels and stand-alone sessions featuring select customers such as St. Jude Medical, Raymond James, Aderas, Turkcell, Paychex, Comcast, Ticketmaster, Bank of America and more. You can see an overview of Data Integration sessions here.

    If you are not able to attend OpenWorld, please check out our latest resources for Data Integration and Oracle GoldenGate. In the coming weeks you will see more blogs about our products' new capabilities and what to expect at OpenWorld. We hope to see you at OpenWorld and stay in touch via our future blogs.

    Read the article

  • Configuring MySQL Cluster Data Nodes

    - by Mat Keep
    In my previous blog post, I discussed the enhanced performance and scalability delivered by extensions to the multi-threaded data nodes in MySQL Cluster 7.2. In this post, I'll share best practices on the configuration of data nodes to achieve optimum performance on the latest generations of multi-core, multi-thread CPU designs.

    Configuring the Data Nodes

    The configuration of data node threads can be managed in two ways via the config.ini file:
    - Simply set MaxNoOfExecutionThreads to the appropriate number of threads to be run in the data node, based on the number of threads presented by the processors used in the host or VM.
    - Use the new ThreadConfig variable, which enables users to configure both the number of each thread type to use and also which CPUs to bind them to.

    The flexible configuration afforded by the multi-threaded data node enhancements means that it is possible to optimize data nodes to use anything from a single CPU/thread up to a 48 CPU/thread server. Co-locating the MySQL Server with a single data node can fully utilize servers with 64-80 CPU/threads. It is also possible to co-locate multiple data nodes per server, but this is now only required for very large servers with 4+ CPU sockets and dense multi-core processors.

    24 Threads and Beyond!

    An example of how to make best use of a 24 CPU/thread server box is to configure the following:
    - 8 ldm threads
    - 4 tc threads
    - 3 recv threads
    - 3 send threads
    - 1 rep thread for asynchronous replication

    Each of those threads should be bound to a CPU. In most installations it is possible to bind the main thread (schema management domain) and the I/O threads to the same CPU. In the configuration above, we have bound threads to 20 different CPUs. We should also protect these 20 CPUs from interrupts by using the IRQBALANCE_BANNED_CPUS configuration variable in /etc/sysconfig/irqbalance, setting it to 0x0FFFFF. The reason for doing this is that MySQL Cluster generates a lot of interrupt and OS kernel processing, so it is recommended to separate that activity across CPUs to ensure that conflicts with the MySQL Cluster threads are eliminated.

    When booting a Linux kernel it is also possible to provide the option isolcpus=0-19 in grub.conf. The result is that the Linux scheduler won't use these CPUs for any task; only processes that set CPU affinity via syscalls can run on them. By using this approach, together with binding MySQL Cluster threads to specific CPUs and banning those CPUs from IRQ processing, a very stable performance environment is created for a MySQL Cluster data node.

    On a 32 CPU/thread server:
    - Increase the number of ldm threads to 12
    - Increase tc threads to 6
    - Provide 2 more CPUs for the OS and interrupts
    - The number of send and receive threads should, in most cases, still be sufficient

    On a 40 CPU/thread server, increase ldm threads to 16, tc threads to 8, and increment send and receive threads to 4.
    On a 48 CPU/thread server it is possible to optimize further by using:
    - 12 tc threads
    - 2 more CPUs for the OS and interrupts
    - Avoiding the use of the I/O threads and the main thread on the same CPU
    - Adding 1 more receive thread

    Summary

    As both this and the previous post seek to demonstrate, the multi-threaded data node extensions not only serve to increase the performance of MySQL Cluster, they also enable users to achieve significantly improved levels of utilization from current and future generations of massively multi-core, multi-thread processor designs. A big thanks to Mikael Ronstrom, Senior MySQL Architect at Oracle, for his work in developing these enhancements and best practices.

    You can download MySQL Cluster 7.2 today and try out all of these enhancements. The Getting Started guides are an invaluable aid to quickly building a Proof of Concept. Don't forget to check out the MySQL Cluster 7.2 New Features whitepaper to discover everything that is new in the latest GA release.
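    To make this concrete, a config.ini fragment implementing the 24 CPU/thread layout described above might look like the following sketch (the thread counts come from this post; the specific CPU numbering is an illustrative assumption):

        [ndbd default]
        # Sketch only: 8 ldm, 4 tc, 3 send, 3 recv and 1 rep thread, each bound to
        # its own CPU; main and io share CPU 19, leaving CPUs 20-23 for the OS.
        ThreadConfig=ldm={count=8,cpubind=0,1,2,3,4,5,6,7},tc={count=4,cpubind=8,9,10,11},send={count=3,cpubind=12,13,14},recv={count=3,cpubind=15,16,17},rep={count=1,cpubind=18},main={count=1,cpubind=19},io={count=1,cpubind=19}

        # Alternatively, the simpler variant described above:
        # MaxNoOfExecutionThreads=24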

    Read the article

  • QWebView in PySide goes wrong after being packaged with PyInstaller

    - by truease.com
    Here's my code:

        import sys
        from PySide.QtCore import *
        from PySide.QtGui import *
        from PySide.QtWebKit import *
        from encodings import *
        from codecs import *


        class BrowserWindow( QWidget ):
            def __init__( self, parent=None ):
                QWidget.__init__( self, parent )
                self.Setup()
                self.SetupEvent()

            def Setup( self ):
                self.setWindowTitle( u"Truease Speedy Browser" )
                self.addr_input = QLineEdit()
                self.addr_go = QPushButton( "GO" )
                self.addr_bar = QHBoxLayout()
                self.addr_bar.addWidget( self.addr_input )
                self.addr_bar.addWidget( self.addr_go )
                for attr in [
                        QWebSettings.AutoLoadImages,
                        QWebSettings.JavascriptEnabled,
                        QWebSettings.JavaEnabled,
                        QWebSettings.PluginsEnabled,
                        QWebSettings.JavascriptCanOpenWindows,
                        QWebSettings.JavascriptCanAccessClipboard,
                        QWebSettings.DeveloperExtrasEnabled,
                        QWebSettings.SpatialNavigationEnabled,
                        QWebSettings.OfflineStorageDatabaseEnabled,
                        QWebSettings.OfflineWebApplicationCacheEnabled,
                        QWebSettings.LocalStorageEnabled,
                        QWebSettings.LocalStorageDatabaseEnabled,
                        QWebSettings.LocalContentCanAccessRemoteUrls,
                        QWebSettings.LocalContentCanAccessFileUrls,
                ]:
                    QWebSettings.globalSettings().setAttribute( attr, True )
                self.web_view = QWebView()
                self.web_view.load( "http://www.baidu.com" )
                layout = QVBoxLayout()
                layout.addLayout( self.addr_bar )
                layout.addWidget( self.web_view )
                self.setLayout( layout )

            def SetupEvent( self ):
                self.connect( self.addr_input, SIGNAL("editingFinished()"),
                              self, SLOT("Load()") )
                self.connect( self.addr_go, SIGNAL("pressed()"),
                              self, SLOT("Load()") )
                self.connect( self.web_view, SIGNAL("urlChanged(const QUrl&)"),
                              self, SLOT("SetURL()") )

            def Load( self, *args, **kwargs ):
                url = self.GetCleanedURL()
                if url != self.CurrentURL():
                    self.web_view.load( url )

            def SetURL( self, *args, **kwargs ):
                self.addr_input.setText( self.CurrentURL() )

            def GetCleanedURL( self ):
                url = self.addr_input.text().strip()
                if not url.startswith("http"):
                    url = "http://" + url
                return url

            def CurrentURL( self ):
                url = self.web_view.url().toString()
                return url


        def Main():
            app = QApplication( sys.argv )
            widget = BrowserWindow()
            widget.show()
            return app.exec_()


        if __name__ == '__main__':
            sys.exit( Main() )

    It works well when I run it with python browser.py, but it goes wrong after being packaged with pyinstaller -w browser.py: it doesn't load images, and it can only display text correctly in UTF-8. This is the pyinstaller output:

        E:\true\wuk\app2>pyinstaller -w b.py
        16 INFO: wrote E:\true\wuk\app2\b.spec
        16 INFO: Testing for ability to set icons, version resources...
        32 INFO: ... resource update available
        32 INFO: UPX is not available.
        46 INFO: Processing hook hook-os
        141 INFO: Processing hook hook-time
        157 INFO: Processing hook hook-cPickle
        218 INFO: Processing hook hook-_sre
        312 INFO: Processing hook hook-cStringIO
        407 INFO: Processing hook hook-encodings
        421 INFO: Processing hook hook-codecs
        750 INFO: Processing hook hook-httplib
        750 INFO: Processing hook hook-email
        843 INFO: Processing hook hook-email.message
        1046 WARNING: library python%s%s required via ctypes not found
        1171 INFO: Extending PYTHONPATH with E:\true\wuk\app2
        1171 INFO: checking Analysis
        1171 INFO: building because b.py changed
        1171 INFO: running Analysis out00-Analysis.toc
        1171 INFO: Adding Microsoft.VC90.CRT to dependent assemblies of final executable
        1171 INFO: Searching for assembly x86_Microsoft.VC90.CRT_1fc8b3b9a1e18e3b_9.0.21022.8_x-ww ...
        1171 INFO: Found manifest C:\WINDOWS\WinSxS\Manifests\x86_Microsoft.VC90.CRT_1fc8b3b9a1e18e3b_9.0.21022.8_x-ww_d08d0375.manifest
        1187 INFO: Searching for file msvcr90.dll
        1187 INFO: Found file C:\WINDOWS\WinSxS\x86_Microsoft.VC90.CRT_1fc8b3b9a1e18e3b_9.0.21022.8_x-ww_d08d0375\msvcr90.dll
        1187 INFO: Searching for file msvcp90.dll
        1187 INFO: Found file C:\WINDOWS\WinSxS\x86_Microsoft.VC90.CRT_1fc8b3b9a1e18e3b_9.0.21022.8_x-ww_d08d0375\msvcp90.dll
        1187 INFO: Searching for file msvcm90.dll
        1187 INFO: Found file C:\WINDOWS\WinSxS\x86_Microsoft.VC90.CRT_1fc8b3b9a1e18e3b_9.0.21022.8_x-ww_d08d0375\msvcm90.dll
        1266 INFO: Analyzing D:\Applications\Python\lib\site-packages\pyinstaller-2.1-py2.7.egg\PyInstaller\loader\_pyi_bootstrap.py
        1266 INFO: Processing hook hook-os
        1282 INFO: Processing hook hook-site
        1296 INFO: Processing hook hook-encodings
        1391 INFO: Processing hook hook-time
        1407 INFO: Processing hook hook-cPickle
        1468 INFO: Processing hook hook-_sre
        1578 INFO: Processing hook hook-cStringIO
        1671 INFO: Processing hook hook-codecs
        2016 INFO: Processing hook hook-httplib
        2016 INFO: Processing hook hook-email
        2109 INFO: Processing hook hook-email.message
        2312 WARNING: library python%s%s required via ctypes not found
        2468 INFO: Processing hook hook-pydoc
        2516 INFO: Analyzing D:\Applications\Python\lib\site-packages\pyinstaller-2.1-py2.7.egg\PyInstaller\loader\pyi_importers.py
        2609 INFO: Analyzing D:\Applications\Python\lib\site-packages\pyinstaller-2.1-py2.7.egg\PyInstaller\loader\pyi_archive.py
        2687 INFO: Analyzing D:\Applications\Python\lib\site-packages\pyinstaller-2.1-py2.7.egg\PyInstaller\loader\pyi_carchive.py
        2782 INFO: Analyzing D:\Applications\Python\lib\site-packages\pyinstaller-2.1-py2.7.egg\PyInstaller\loader\pyi_os_path.py
        2782 INFO: Analyzing b.py
        2796 INFO: Processing hook hook-PySide
        2875 INFO: Hidden import 'codecs' has been found otherwise
        2875 INFO: Hidden import 'encodings' has been found otherwise
        2875 INFO: Looking for run-time hooks
        7766 INFO: Using Python library C:\WINDOWS\system32\python27.dll
        7796 INFO: E:\true\wuk\app2\build\b\out00-Analysis.toc no change!
        7796 INFO: checking PYZ
        7812 INFO: checking PKG
        7812 INFO: building because E:\true\wuk\app2\build\b\b.exe.manifest changed
        7812 INFO: building PKG (CArchive) out00-PKG.pkg
        7828 INFO: checking EXE
        7843 INFO: rebuilding out00-EXE.toc because pkg is more recent
        7843 INFO: building EXE from out00-EXE.toc
        7843 INFO: Appending archive to EXE E:\true\wuk\app2\build\b\b.exe
        7843 INFO: checking COLLECT
        7843 INFO: building COLLECT out00-COLLECT.toc

    When I use pyinstaller browser.py (without -w), in the console window I get:

        QFont::setPixelSize: Pixel size <= 0 (0)
        QSslSocket: cannot call unresolved function SSLv23_client_method
        QSslSocket: cannot call unresolved function SSL_CTX_new
        QSslSocket: cannot call unresolved function SSL_library_init
        QSslSocket: cannot call unresolved function ERR_get_error
        QSslSocket: cannot call unresolved function SSLv23_client_method
        QSslSocket: cannot call unresolved function SSL_CTX_new
        QSslSocket: cannot call unresolved function SSL_library_init
        QSslSocket: cannot call unresolved function ERR_get_error
        QSslSocket: cannot call unresolved function SSLv23_client_method
        QSslSocket: cannot call unresolved function SSL_CTX_new
        QSslSocket: cannot call unresolved function SSL_library_init
        QSslSocket: cannot call unresolved function ERR_get_error
        QSslSocket: cannot call unresolved function SSLv23_client_method
        QSslSocket: cannot call unresolved function SSL_CTX_new
        QSslSocket: cannot call unresolved function SSL_library_init
        QSslSocket: cannot call unresolved function ERR_get_error
        QFont::setPixelSize: Pixel size <= 0 (0)
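    One speculative direction: missing images plus unresolved QSslSocket functions are symptoms often reported when Qt's imageformats plugins and the OpenSSL DLLs are not bundled with the frozen app. A hedged sketch of a .spec file that copies PySide's image-format plugins next to the executable (the PySide install path below is an assumption based on the D:\Applications\Python paths in the log):

        # b.spec -- sketch: bundle PySide's image-format plugins alongside the exe
        a = Analysis(['b.py'])
        pyz = PYZ(a.pure)
        exe = EXE(pyz, a.scripts, exclude_binaries=True, name='b.exe', console=False)
        coll = COLLECT(exe, a.binaries, a.zipfiles, a.datas,
                       # copy <PySide>\plugins\imageformats -> dist\b\imageformats
                       Tree(r'D:\Applications\Python\Lib\site-packages\PySide\plugins\imageformats',
                            prefix='imageformats'),
                       name='b')

    Building with pyinstaller b.spec would then pick up the plugins; the OpenSSL DLLs (libeay32.dll/ssleay32.dll), if available, could be added the same way.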

    Read the article

  • Data Processing in Business Intelligence Applications

    - by Manoj
    Is data processing a part of business intelligence tools? If not, how else is it done; if yes, can you give a few examples? My use case: I am using Pentaho Reporting to generate tables and graphs of my test data. Now I want to compare the data against the spec, see whether it passes or fails, and report the same. The spec is also in the database. What is the best way to do it?
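    One way to express the comparison is directly in the report's SQL, joining the results to the spec table. A minimal sketch (table and column names are assumptions, since the actual schema isn't shown):

        SELECT r.test_id,
               r.measured_value,
               s.min_value,
               s.max_value,
               CASE WHEN r.measured_value BETWEEN s.min_value AND s.max_value
                    THEN 'PASS' ELSE 'FAIL' END AS result
        FROM   test_results r
        JOIN   test_spec s ON s.test_id = r.test_id;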

    Read the article

  • Processing potentially large STDIN data, more than once

    - by d11wtq
    I'd like to provide an accessor on a class that provides an NSInputStream for STDIN, which may hold several hundred megabytes (or gigabytes, though that's unlikely) of data. When a caller gets this NSInputStream it should be able to read from it without worrying about exhausting the data it contains. In other words, another block of code may request the NSInputStream and will expect to be able to read from it. Without first copying all of the data into an NSData object, which (I assume) would cause memory exhaustion, what are my options for handling this? The returned NSInputStream does not have to be the same instance; it simply needs to provide the same data. The best I can come up with right now is to copy STDIN to a temporary file and then return NSInputStream instances backed by that file. Is this pretty much the only way to handle it? Is there anything I should be cautious of if I go the temporary-file route?
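    For reference, a minimal sketch of the temporary-file approach described above (the method and the spoolPath property are illustrative assumptions):

        // Spool STDIN to a temp file once; every caller then gets a fresh stream over it.
        - (NSInputStream *)standardInputStream {
            if (self.spoolPath == nil) {
                NSString *path = [NSTemporaryDirectory()
                                     stringByAppendingPathComponent:@"stdin.spool"];
                [[NSFileManager defaultManager] createFileAtPath:path
                                                        contents:nil attributes:nil];
                NSFileHandle *writer = [NSFileHandle fileHandleForWritingAtPath:path];
                NSFileHandle *stdinHandle = [NSFileHandle fileHandleWithStandardInput];
                NSData *chunk;
                // 64 KB chunks keep memory usage bounded regardless of input size
                while ((chunk = [stdinHandle readDataOfLength:65536]).length > 0) {
                    [writer writeData:chunk];
                }
                [writer closeFile];
                self.spoolPath = path;
            }
            return [NSInputStream inputStreamWithFileAtPath:self.spoolPath];
        }

    The main things to be cautious of with this route are cleaning up the temp file on exit and the fact that callers block until STDIN has been fully spooled.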

    Read the article

  • Why is jQuery's .data() function better for preventing memory leaks?

    - by burak ozdogan
    Hi,

    Regarding the jQuery utility function jQuery.data(), the online documentation says: "The jQuery.data() method allows us to attach data of any type to DOM elements in a way that is safe from circular references and therefore from memory leaks." Why can using document.body.foo = 52; result in a memory leak, and under what conditions, such that I should use jQuery.data(document.body, 'foo', 52); instead? Should I ALWAYS prefer .data() instead of using expandos, in every case? (I would appreciate it if you could provide an example comparing the two.)

    Thanks,
    burak ozdogan
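    A small sketch of the kind of circular reference the documentation warns about (classically a leak in older IE, where a DOM node and a JS object referencing each other could defeat the garbage collector):

        // Expando: the DOM element holds a JS object that holds the element -> cycle
        var el = document.getElementById('box');
        el.handler = { node: el };          // el -> handler -> el

        // jQuery.data: the value lives in jQuery's internal cache, keyed by an id,
        // so the element itself never directly references the JS object
        jQuery.data(el, 'handler', { node: el });
        jQuery.removeData(el, 'handler');   // cleanup is explicit and reliable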

    Read the article

  • File copying utility like rsync with error handling like ddrescue, for data recovery from a hard drive with bad sectors or hardware failure

    - by purefusion
    I have a hard drive with either bad blocks or sectors that are failing to read due to potential mechanical issues, such as a bad disk head, bad motor, or some other issue that is causing the hard drive to read data excruciatingly slowly and with lots of read errors. I'm seeing an average of 50 KB/sec, with some reads dropping below 10 KB/sec, and frequently it gets stuck on a file or sector altogether, usually for quite a long time, from 2-10 minutes or more (when using rsync, before it times out).

    Speed seems to vary wildly, and it gets stuck on files a lot, and when it finally gets "unstuck" it only seems to last for a short burst before it gets stuck again. The drive is also very quiet with only an occasional sound of files copying (usually when it gets stuck/unstuck for a brief time, before getting stuck again). Thus, there are none of those evil sounds that are normally associated with HDD death. Someone suggested that the problems sounded like they might be caused by a misaligned disk head, which requires a lot of re-reads before it finally reads data with success. Sounds plausible, but I digress...

    Anyway, the problem with rsync is that it seems to have no decent error handling support. Obviously, it wasn't meant for use in recovering data from failing hard drives, but all the so-called "data recovery" utilities out there that are meant for such use usually focus on recovery of deleted files or messed up partitions, rather than copying files off dying hard drives. Deleted file recovery is not what I need, obviously, so perhaps you can understand my disappointment in not being able to find what I'm after yet.

    Naturally, this is where you'd probably say "You should use ddrescue!" Well, that's all fine and dandy, but I've already got most of the data backed up, so I just want to recover certain files. I'm not concerned with trying to recover a full partition block-by-block as ddrescue does. I am only interested in rescuing just specific files and directories.

    Ideally, what I'd like is some sort of cross between rsync and ddrescue: something that lets me specify source and destination as directories of normal files like rsync (rather than two full partitions as ddrescue requires), with a way to skip files with errors in an initial run, and then allows me to attempt recovery of those files with errors in a later run (with a slightly altered command, of course), perhaps even offering an option to specify the number of retry attempts ...just like how ddrescue works with blocks, only I want a utility that works with specific files/directories like rsync does.

    So am I daydreaming here, or does something out there exist that can do this? Or, maybe even a way to make rsync or ddrescue work in such a way? I'm really open to whatever solutions might work, so long as they let me choose which files I want to "rescue", and can skip files with errors in the initial run, and try/retry those errors again later.

    So far I've tried rsync with the following options, but it often gets stuck on a file for longer than the timeout, and ideally I'd just like it to move on to the next file and come back later to the files it gets stuck on. I don't think that's possible though. Anyway, here's what I've been using up till now:

        rsync -avP --stats --block-size=512 --timeout=600 /path/to/source/* /path/to/destination/
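    One improvised direction, sketched below under the assumption that GNU ddrescue and the coreutils timeout command are installed: ddrescue also works on individual files, not just whole partitions, so a per-file loop can give each file a time budget on the first pass and log the stragglers for a later retry pass:

        #!/bin/bash
        # First pass: copy each file with a 60s budget; record failures for later retries.
        src=/path/to/source; dst=/path/to/destination
        cd "$src" || exit 1
        find . -type f -print0 | while IFS= read -r -d '' f; do
            mkdir -p "$dst/$(dirname "$f")"
            # -n skips ddrescue's slow scraping/splitting phase on this first pass
            timeout 60 ddrescue -n "$f" "$dst/$f" "$dst/$f.map" \
                || printf '%s\n' "$f" >> /tmp/stuck-files.txt
        done
        # Later: rerun ddrescue on the stuck files without -n (and e.g. -r3 for retries),
        # reusing each .map file so already-rescued regions are not re-read.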

    Read the article

  • Client-side or server-side processing?

    - by Nick
    So, I'm new to dynamic web design (my sites have been mostly static with some PHP), and I'm trying to learn the latest technologies in web development (which seems to be AJAX). I was wondering: if you're transferring a lot of data, is it better to construct the page on the server and "push" it to the user, or is it better to "pull" the data needed and create the HTML around it on the client side using JavaScript? More specifically, I'm using CodeIgniter as my PHP framework and jQuery for JavaScript, and if I wanted to display a table of data to the user (dynamically), would it be better to format the HTML using CodeIgniter (create the tables, add CSS classes to elements, etc.), or would it be better to just serve the raw data as JSON and then build it into a table with jQuery? My intuition says to do it client-side, as it would save bandwidth and the page would probably load quicker with the new JavaScript optimizations all these browsers have now; however, the site would then break for someone not using JavaScript...

    Thanks for the help
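    For reference, a minimal sketch of the client-side option described above (the URL, element id and field names are assumptions):

        // Fetch rows as JSON from a CodeIgniter endpoint and build the table client-side
        $.getJSON('/index.php/report/rows', function (rows) {
            var $table = $('<table class="data-grid"></table>');
            $.each(rows, function (i, row) {
                $('<tr>')
                    .append($('<td>').text(row.name))   // .text() escapes HTML for us
                    .append($('<td>').text(row.value))
                    .appendTo($table);
            });
            $('#results').empty().append($table);
        });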

    Read the article

  • Using mcrypt to pass data across a webservice is failing

    - by adam
    Hi

    I'm writing an error handler script which encrypts the error data (file, line, error, message etc.) and passes the serialized array as a POST variable (using curl) to a script which then logs the error in a central db.

    I've tested my encrypt/decrypt functions in a single file and the data is encrypted and decrypted fine:

        define('KEY', 'abc');
        define('CYPHER', 'blowfish');
        define('MODE', 'cfb');

        function encrypt($data) {
            $td = mcrypt_module_open(CYPHER, '', MODE, '');
            $iv = mcrypt_create_iv(mcrypt_enc_get_iv_size($td), MCRYPT_RAND);
            mcrypt_generic_init($td, KEY, $iv);
            $crypttext = mcrypt_generic($td, $data);
            mcrypt_generic_deinit($td);
            return $iv.$crypttext;
        }

        function decrypt($data) {
            $td = mcrypt_module_open(CYPHER, '', MODE, '');
            $ivsize = mcrypt_enc_get_iv_size($td);
            $iv = substr($data, 0, $ivsize);
            $data = substr($data, $ivsize);
            if ($iv) {
                mcrypt_generic_init($td, KEY, $iv);
                $data = mdecrypt_generic($td, $data);
            }
            return $data;
        }

        echo "<pre>";
        $data = md5('');
        echo "Data: $data\n";
        $e = encrypt($data);
        echo "Encrypted: $e\n";
        $d = decrypt($e);
        echo "Decrypted: $d\n";

    Output:

        Data: d41d8cd98f00b204e9800998ecf8427e
        Encrypted: ê÷#¯KžViiÖŠŒÆÜ,ÑFÕUW£´Œt?†÷>c×åóéè+„N
        Decrypted: d41d8cd98f00b204e9800998ecf8427e

    The problem is, when I put the encrypt function in my transmit file (tx.php) and the decrypt in my receive file (rx.php), the data is not fully decrypted (both files have the same set of constants for key, cypher and mode).

    Data before passing:

        a:4:{s:3:"err";i:1024;s:3:"msg";s:4:"Oops";s:4:"file";s:46:"/Applications/MAMP/htdocs/projects/txrx/tx.php";s:4:"line";i:80;}

    Data decrypted:

        Mª4:{s:3:"err";i:1024@7OYªç`^;g";s:4:"Oops";s:4:"file";sôÔ8F•Ópplications/MAMP/htdocs/projects/txrx/tx.php";s:4:"line";i:80;}

    Note the random characters in the middle. My curl is fairly simple:

        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, 'data=' . $data);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $output = curl_exec($ch);

    Things I suspect could be causing this:
    - Encoding of the curl request
    - Something to do with mcrypt padding missing bytes
    - I've been staring at it too long and have missed something really really obvious

    If I turn off the crypt functions (so the transfer tx-rx is unencrypted) the data is received fine. Any and all help much appreciated!

    Thanks,
    Adam
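    A sketch of one likely remedy (the diagnosis is an assumption, but it is consistent with the symptoms): raw ciphertext contains bytes that get mangled inside an application/x-www-form-urlencoded POST body, so encode it for transport and decode it on the receiving side:

        // tx.php -- make the binary ciphertext safe for a POST body
        curl_setopt($ch, CURLOPT_POSTFIELDS, 'data=' . urlencode(base64_encode($data)));

        // rx.php -- PHP url-decodes $_POST automatically; just undo the base64
        $data = decrypt(base64_decode($_POST['data']));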

    Read the article

  • Synchronize Data between a Silverlight ListBox and a User Control

    - by psheriff
    One of the great things about XAML is its powerful data-binding capabilities. If you load up a list box with a collection of objects, you can display detail data about each object without writing any C# or VB.NET code. Take a look at Figure 1, which shows a collection of Product objects in a list box. When you click on the list box, you bind the currently selected Product object to a set of controls in a user control with just a very simple Binding statement in XAML.

    Figure 1: Synchronizing a ListBox to a User Control is easy with Data Binding

    Product and Products Classes

    To illustrate this data binding feature I am going to just create some local data instead of using a WCF service. The code below shows a Product class that has three properties, namely ProductId, ProductName and Price. This class also has a constructor that takes 3 parameters and allows us to set the 3 properties in an instance of our Product class.

    C#

        public class Product
        {
          public Product(int productId, string productName, decimal price)
          {
            ProductId = productId;
            ProductName = productName;
            Price = price;
          }

          public int ProductId { get; set; }
          public string ProductName { get; set; }
          public decimal Price { get; set; }
        }

    VB

        Public Class Product
          Public Sub New(ByVal _productId As Integer, _
                         ByVal _productName As String, _
                         ByVal _price As Decimal)
            ProductId = _productId
            ProductName = _productName
            Price = _price
          End Sub

          Private mProductId As Integer
          Private mProductName As String
          Private mPrice As Decimal

          Public Property ProductId() As Integer
            Get
              Return mProductId
            End Get
            Set(ByVal value As Integer)
              mProductId = value
            End Set
          End Property

          Public Property ProductName() As String
            Get
              Return mProductName
            End Get
            Set(ByVal value As String)
              mProductName = value
            End Set
          End Property

          Public Property Price() As Decimal
            Get
              Return mPrice
            End Get
            Set(ByVal value As Decimal)
              mPrice = value
            End Set
          End Property
        End Class

    To fill up a list box you need a collection class of Product objects. The code below creates a generic collection class of Product objects. In the constructor of the Products class I have hard-coded five Product objects and added them to the collection. In a real-world application you would get your data through a call to a service to fill the list box, but for simplicity, and just to illustrate the data binding, I am going to hard-code the data.

    C#

        public class Products : List<Product>
        {
          public Products()
          {
            this.Add(new Product(1, "Microsoft VS.NET 2008", 1000));
            this.Add(new Product(2, "Microsoft VS.NET 2010", 1000));
            this.Add(new Product(3, "Microsoft Silverlight 4", 1000));
            this.Add(new Product(4, "Fundamentals of N-Tier eBook", 20));
            this.Add(new Product(5, "ASP.NET Security eBook", 20));
          }
        }

    VB

        Public Class Products
          Inherits List(Of Product)

          Public Sub New()
            Me.Add(New Product(1, "Microsoft VS.NET 2008", 1000))
            Me.Add(New Product(2, "Microsoft VS.NET 2010", 1000))
            Me.Add(New Product(3, "Microsoft Silverlight 4", 1000))
            Me.Add(New Product(4, "Fundamentals of N-Tier eBook", 20))
            Me.Add(New Product(5, "ASP.NET Security eBook", 20))
          End Sub
        End Class

    The Product Detail User Control

    Below is a user control (named ucProduct) that is used to display the product detail information seen in the bottom portion of Figure 1. This is very basic XAML that just creates a text block and a text box control for each of the three properties in the Product class. Notice the {Binding Path=[PropertyName]} on each of the text box controls. This means that if the DataContext property of this user control is set to an instance of a Product class, then the data in the properties of that Product object will be displayed in each of the text boxes.

        <UserControl x:Class="SL_SyncListBoxAndUserControl_CS.ucProduct"
          xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
          xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
          HorizontalAlignment="Left"
          VerticalAlignment="Top">
          <Grid Margin="4">
            <Grid.RowDefinitions>
              <RowDefinition Height="Auto" />
              <RowDefinition Height="Auto" />
              <RowDefinition Height="Auto" />
            </Grid.RowDefinitions>
            <Grid.ColumnDefinitions>
              <ColumnDefinition MinWidth="120" />
              <ColumnDefinition />
            </Grid.ColumnDefinitions>
            <TextBlock Grid.Row="0" Grid.Column="0" Text="Product Id" />
            <TextBox Grid.Row="0" Grid.Column="1" Text="{Binding Path=ProductId}" />
            <TextBlock Grid.Row="1" Grid.Column="0" Text="Product Name" />
            <TextBox Grid.Row="1" Grid.Column="1" Text="{Binding Path=ProductName}" />
            <TextBlock Grid.Row="2" Grid.Column="0" Text="Price" />
            <TextBox Grid.Row="2" Grid.Column="1" Text="{Binding Path=Price}" />
          </Grid>
        </UserControl>

    Synchronize ListBox with User Control

    You are now ready to fill the list box with the collection class of Product objects and then bind the SelectedItem of the list box to the Product detail user control. The XAML below is the complete code for Figure 1.

        <UserControl x:Class="SL_SyncListBoxAndUserControl_CS.MainPage"
          xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
          xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
          xmlns:src="clr-namespace:SL_SyncListBoxAndUserControl_CS"
          VerticalAlignment="Top"
          HorizontalAlignment="Left">
          <UserControl.Resources>
            <src:Products x:Key="productCollection" />
          </UserControl.Resources>
          <Grid x:Name="LayoutRoot" Margin="4" Background="White">
            <Grid.RowDefinitions>
              <RowDefinition Height="Auto" />
              <RowDefinition Height="*" />
            </Grid.RowDefinitions>
            <ListBox x:Name="lstData"
                     Grid.Row="0"
                     BorderBrush="Black"
                     BorderThickness="1"
                     ItemsSource="{Binding Source={StaticResource productCollection}}"
                     DisplayMemberPath="ProductName" />
            <src:ucProduct x:Name="prodDetail"
                           Grid.Row="1"
                           DataContext="{Binding ElementName=lstData, Path=SelectedItem}" />
          </Grid>
        </UserControl>

    The first step to making this happen is to reference the Silverlight project (SL_SyncListBoxAndUserControl_CS) where the Product and Products classes are located. I added this namespace and assigned it a namespace prefix of "src", as shown in the line below:

        xmlns:src="clr-namespace:SL_SyncListBoxAndUserControl_CS"

    Next, to use the data from an instance of the Products collection, you create a UserControl.Resources section in the XAML and add a tag that creates an instance of the Products class and assigns it a key of "productCollection":

        <UserControl.Resources>
          <src:Products x:Key="productCollection" />
        </UserControl.Resources>

    Next, you bind the list box to this productCollection object using the ItemsSource property, binding it to the static resource named productCollection. You can then set the DisplayMemberPath attribute of the list box to any property of the Product class that you want. In the XAML below I used the ProductName property:

        <ListBox x:Name="lstData"
                 ItemsSource="{Binding Source={StaticResource productCollection}}"
                 DisplayMemberPath="ProductName" />

    You now need to create an instance of the ucProduct user control below the list box. You do this by once again referencing the "src" namespace and typing in the name of the user control. You then set the DataContext property on this user control to a binding. The binding uses the ElementName attribute to bind to the list box name, in this case "lstData". The Path of the data is SelectedItem. Together, these two attributes tell Silverlight to bind the DataContext to the selected item of the list box. That selected item is a Product object, so once this is bound, the bindings on each text box in the user control are updated and display the current product information:

        <src:ucProduct x:Name="prodDetail"
                       DataContext="{Binding ElementName=lstData, Path=SelectedItem}" />

    Summary

    Once you understand the basics of data binding in XAML, you eliminate a lot of code that is otherwise needed to move data into controls and back out of controls into an object. Connecting two controls together is easy by binding with the ElementName and Path properties of the Binding markup extension. Another good tip from this blog: use user controls and set the DataContext of the user control to have all of the data on the user control update through the bindings.

    NOTE: You can download the complete sample code (in both VB and C#) at my website, http://www.pdsa.com/downloads. Choose Tips & Tricks, then "SL – Synchronize List Box Data with User Control" from the drop-down.

    Good Luck with your Coding,
    Paul Sheriff

    ** SPECIAL OFFER FOR MY BLOG READERS **
    Visit http://www.pdsa.com/Event/Blog for a free eBook on "Fundamentals of N-Tier".

    Read the article

  • Software center is broken

    - by Colin
    When I started installing the Humble Indie Bundle 5 games the software center stopped working, and now I get this error: "Packages cannot be installed or removed, click here to repair." The repair fails and gives these results:

        installArchives() failed:
        (Reading database ... 255502 files and directories currently installed.)
        Unpacking libqtcore4:i386 (from .../libqtcore4_4%3a4.8.1-0ubuntu4.1_i386.deb) ...
        dpkg: error processing /var/cache/apt/archives/libqtcore4_4%3a4.8.1-0ubuntu4.1_i386.deb (--unpack):
         conffile './etc/xdg/Trolltech.conf' is not in sync with other instances of the same package
        No apport report written because MaxReports is reached already
        Errors were encountered while processing:
         /var/cache/apt/archives/libqtcore4_4%3a4.8.1-0ubuntu4.1_i386.deb

    Error in function:

        dpkg: dependency problems prevent configuration of libqtgui4:i386:
         libqtgui4:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
        dpkg: error processing libqtgui4:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-sql:i386:
         libqt4-sql:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
        dpkg: error processing libqt4-sql:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of ia32-libs-multiarch:i386:
         ia32-libs-multiarch:i386 depends on libqt4-sql; however:
          Package libqt4-sql:i386 is not configured yet.
         ia32-libs-multiarch:i386 depends on libqtcore4; however:
          Package libqtcore4:i386 is not installed.
         ia32-libs-multiarch:i386 depends on libqtgui4; however:
          Package libqtgui4:i386 is not configured yet.
        dpkg: error processing ia32-libs-multiarch:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-declarative:i386:
         libqt4-declarative:i386 depends on libqt4-sql (= 4:4.8.1-0ubuntu4.1); however:
          Package libqt4-sql:i386 is not configured yet.
         libqt4-declarative:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
         libqt4-declarative:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtgui4:i386 is not configured yet.
        dpkg: error processing libqt4-declarative:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-svg:i386:
         libqt4-svg:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
         libqt4-svg:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtgui4:i386 is not configured yet.
        dpkg: error processing libqt4-svg:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-network:i386:
         libqt4-network:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
        dpkg: error processing libqt4-network:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-sql-mysql:i386:
         libqt4-sql-mysql:i386 depends on libqt4-sql (= 4:4.8.1-0ubuntu4.1); however:
          Package libqt4-sql:i386 is not configured yet.
         libqt4-sql-mysql:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
        dpkg: error processing libqt4-sql-mysql:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-script:i386:
         libqt4-script:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
        dpkg: error processing libqt4-script:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-dbus:i386:
         libqt4-dbus:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
        dpkg: error processing libqt4-dbus:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-opengl:i386:
         libqt4-opengl:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
         libqt4-opengl:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtgui4:i386 is not configured yet.
        dpkg: error processing libqt4-opengl:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqtwebkit4:i386:
         libqtwebkit4:i386 depends on libqt4-network (>= 4:4.8.0~); however:
          Package libqt4-network:i386 is not configured yet.
         libqtwebkit4:i386 depends on libqtcore4 (>= 4:4.8.0~); however:
          Package libqtcore4:i386 is not installed.
         libqtwebkit4:i386 depends on libqtgui4 (>= 4:4.8.0); however:
          Package libqtgui4:i386 is not configured yet.
        dpkg: error processing libqtwebkit4:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-designer:i386:
         libqt4-designer:i386 depends on libqt4-script (= 4:4.8.1-0ubuntu4.1); however:
          Package libqt4-script:i386 is not configured yet.
         libqt4-designer:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
         libqt4-designer:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtgui4:i386 is not configured yet.
        dpkg: error processing libqt4-designer:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of lonesurvivor-bin:i386:
         lonesurvivor-bin:i386 depends on ia32-libs-multiarch; however:
          Package ia32-libs-multiarch:i386 is not configured yet.
        dpkg: error processing lonesurvivor-bin:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of lonesurvivor:
         lonesurvivor depends on lonesurvivor-bin (= 1.11d-0ubuntu5); however:
          Package lonesurvivor-bin is not installed.
          Package lonesurvivor-bin:i386 is not configured yet.
        dpkg: error processing lonesurvivor (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-scripttools:i386:
         libqt4-scripttools:i386 depends on libqt4-script (= 4:4.8.1-0ubuntu4.1); however:
          Package libqt4-script:i386 is not configured yet.
         libqt4-scripttools:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
         libqt4-scripttools:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtgui4:i386 is not configured yet.
        dpkg: error processing libqt4-scripttools:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-qt3support:i386:
         libqt4-qt3support:i386 depends on libqt4-designer (= 4:4.8.1-0ubuntu4.1); however:
          Package libqt4-designer:i386 is not configured yet.
         libqt4-qt3support:i386 depends on libqt4-network (= 4:4.8.1-0ubuntu4.1); however:
          Package libqt4-network:i386 is not configured yet.
         libqt4-qt3support:i386 depends on libqt4-sql (= 4:4.8.1-0ubuntu4.1); however:
          Package libqt4-sql:i386 is not configured yet.
         libqt4-qt3support:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
         libqt4-qt3support:i386 depends on libqtgui4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtgui4:i386 is not configured yet.
        dpkg: error processing libqt4-qt3support:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-xml:i386:
         libqt4-xml:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
        dpkg: error processing libqt4-xml:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-test:i386:
         libqt4-test:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
        dpkg: error processing libqt4-test:i386 (--configure):
         dependency problems - leaving unconfigured
        dpkg: dependency problems prevent configuration of libqt4-xmlpatterns:i386:
         libqt4-xmlpatterns:i386 depends on libqt4-network (= 4:4.8.1-0ubuntu4.1); however:
          Package libqt4-network:i386 is not configured yet.
         libqt4-xmlpatterns:i386 depends on libqtcore4 (= 4:4.8.1-0ubuntu4.1); however:
          Package libqtcore4:i386 is not installed.
        dpkg: error processing libqt4-xmlpatterns:i386 (--configure):
         dependency problems - leaving unconfigured

    I couldn't get apt-get update or upgrade to work either, so I shut off the repositories and updated/upgraded one at a time without any problems. But that didn't fix the Software Center. Help would be greatly appreciated.

    ADDED UPDATE

    I've tried to install aptitude using dpkg but can't. I have also tried sudo apt-get dist-upgrade, sudo apt-get autoremove, sudo dpkg --configure -a --force-all, and the -f options. Though I believe this is where my problem is originating:

        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Correcting dependencies... Done
        The following extra packages will be installed:
          libqtcore4:i386
        The following NEW packages will be installed:
          libqtcore4:i386
        0 upgraded, 1 newly installed, 0 to remove and 0 not upgraded.
        Need to get 0 B/2,061 kB of archives.
        After this operation, 9,041 kB of additional disk space will be used.
        Do you want to continue [Y/n]? y
        (Reading database ... 255526 files and directories currently installed.)
        Unpacking libqtcore4:i386 (from .../libqtcore4_4%3a4.8.1-0ubuntu4.1_i386.deb) ...
        dpkg: error processing /var/cache/apt/archives/libqtcore4_4%3a4.8.1-0ubuntu4.1_i386.deb (--unpack):
         conffile './etc/xdg/Trolltech.conf' is not in sync with other instances of the same package
        Errors were encountered while processing:
         /var/cache/apt/archives/libqtcore4_4%3a4.8.1-0ubuntu4.1_i386.deb
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    I hope this helps narrow it down some.

    SECOND UPDATE

        sudo apt-get --reinstall install software-center -f
        The following packages have unmet dependencies:
         ia32-libs-multiarch:i386 : Depends: libqtcore4:i386 but it is not going to be installed
         libqt4-dbus:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-declarative:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-designer:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-network:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-opengl:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-qt3support:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-script:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-scripttools:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-sql:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-sql-mysql:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-svg:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-test:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-xml:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqt4-xmlpatterns:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqtgui4:i386 : Depends: libqtcore4:i386 (= 4:4.8.1-0ubuntu4.1) but it is not going to be installed
         libqtwebkit4:i386 : Depends: libqtcore4:i386 (>= 4:4.8.0~) but it is not going to be installed
        E: Unmet dependencies. Try 'apt-get -f install' with no packages (or specify a solution).

    sudo apt-get -f install doesn't work either. The complete output of the terminal for step 5 can be found here: http://paste.ubuntu.com/1066192/
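    A speculative note: "conffile ... is not in sync with other instances of the same package" typically points at the i386 and amd64 copies of libqtcore4 disagreeing about /etc/xdg/Trolltech.conf. One hedged sketch of a repair, telling dpkg to accept the package's version of the conffile when reinstalling both architectures together:

        # clear the cached archive, then retry with dpkg accepting the new conffile
        sudo apt-get clean
        sudo apt-get -o Dpkg::Options::="--force-confnew" \
            install --reinstall libqtcore4 libqtcore4:i386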

    Read the article

  • Getting a database connectivity issue from only one part of my ASP.NET project?

    - by Greg
    Hi, something weird has started happening to my project with Dynamic Data. Suddenly I am getting connection errors when going to the DD navigation pages, but things work fine on the default DD main page, and other MVC pages I've added myself also reach the database fine. Any ideas? In summary, if I go to the following web pages:

    * custom URL for my custom controller - this works fine, including getting database data
    * main DD page from root URL - this works fine
    * click on a link to a table maintenance page from the main DD page - GET DATABASE CONNECTIVITY ERROR

    Some items:

    * I'm using SQL Server Express 2008
    * There doesn't seem to be any debug/error info in VS2010 that I can see
    * Web config entry:

    <add name="Model1Container" connectionString="metadata=res://*/Model1.csdl|res://*/Model1.ssdl|res://*/Model1.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=GREG\SQLEXPRESS_2008;Initial Catalog=greg_development;Integrated Security=True;MultipleActiveResultSets=True&quot;" providerName="System.Data.EntityClient"/>

    Error:

    Server Error in '/' Application.
    A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)

    Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

    Exception Details: System.Data.SqlClient.SqlException: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)

    Source Error:

    Line 39:     DropDownList1.Items.Add(new ListItem("[Not Set]", NullValueString));
    Line 40: }
    Line 41: PopulateListControl(DropDownList1);
    Line 42: // Set the initial value if there is one
    Line 43: string initialValue = DefaultValue;

    Source File: U:\My Dropbox\source\ToplogyLibrary\Topology_Web_Dynamic\DynamicData\Filters\ForeignKey.ascx.cs    Line: 41

    Stack Trace:

    [SqlException (0x80131904): A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections.
    (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)]
      System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) +5009598
      System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning() +234
      System.Data.SqlClient.TdsParser.Connect(ServerInfo serverInfo, SqlInternalConnectionTds connHandler, Boolean ignoreSniOpenTimeout, Int64 timerExpire, Boolean encrypt, Boolean trustServerCert, Boolean integratedSecurity) +341
      System.Data.SqlClient.SqlInternalConnectionTds.AttemptOneLogin(ServerInfo serverInfo, String newPassword, Boolean ignoreSniOpenTimeout, TimeoutTimer timeout, SqlConnection owningObject) +129
      System.Data.SqlClient.SqlInternalConnectionTds.LoginNoFailover(ServerInfo serverInfo, String newPassword, Boolean redirectedUserInstance, SqlConnection owningObject, SqlConnectionString connectionOptions, TimeoutTimer timeout) +239
      System.Data.SqlClient.SqlInternalConnectionTds.OpenLoginEnlist(SqlConnection owningObject, TimeoutTimer timeout, SqlConnectionString connectionOptions, String newPassword, Boolean redirectedUserInstance) +195
      System.Data.SqlClient.SqlInternalConnectionTds..ctor(DbConnectionPoolIdentity identity, SqlConnectionString connectionOptions, Object providerInfo, String newPassword, SqlConnection owningObject, Boolean redirectedUserInstance) +232
      System.Data.SqlClient.SqlConnectionFactory.CreateConnection(DbConnectionOptions options, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningConnection) +185
      System.Data.ProviderBase.DbConnectionFactory.CreatePooledConnection(DbConnection owningConnection, DbConnectionPool pool, DbConnectionOptions options) +33
      System.Data.ProviderBase.DbConnectionPool.CreateObject(DbConnection owningObject) +524
      System.Data.ProviderBase.DbConnectionPool.UserCreateRequest(DbConnection owningObject) +66
      System.Data.ProviderBase.DbConnectionPool.GetConnection(DbConnection owningObject) +479
      System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection) +108
      System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory) +126
      System.Data.SqlClient.SqlConnection.Open() +125
      System.Data.EntityClient.EntityConnection.OpenStoreConnectionIf(Boolean openCondition, DbConnection storeConnectionToOpen, DbConnection originalConnection, String exceptionCode, String attemptedOperation, Boolean& closeStoreConnectionOnFailure) +52

    [EntityException: The underlying provider failed on Open.]
      System.Data.EntityClient.EntityConnection.OpenStoreConnectionIf(Boolean openCondition, DbConnection storeConnectionToOpen, DbConnection originalConnection, String exceptionCode, String attemptedOperation, Boolean& closeStoreConnectionOnFailure) +161
      System.Data.EntityClient.EntityConnection.Open() +98
      System.Data.Objects.ObjectContext.EnsureConnection() +81
      System.Data.Objects.ObjectQuery`1.GetResults(Nullable`1 forMergeOption) +46
      System.Data.Objects.ObjectQuery`1.System.Collections.Generic.IEnumerable<T>.GetEnumerator() +44
      System.Data.Objects.ObjectQuery`1.GetEnumeratorInternal() +36
      System.Data.Objects.ObjectQuery.System.Collections.IEnumerable.GetEnumerator() +10
      System.Web.DynamicData.Misc.FillListItemCollection(IMetaTable table, ListItemCollection listItemCollection) +50
      System.Web.DynamicData.QueryableFilterUserControl.PopulateListControl(ListControl listControl) +85
      Topology_Web_Dynamic.ForeignKeyFilter.Page_Init(Object sender, EventArgs e) in U:\My Dropbox\source\ToplogyLibrary\Topology_Web_Dynamic\DynamicData\Filters\ForeignKey.ascx.cs:41
      System.Web.Util.CalliHelper.EventArgFunctionCaller(IntPtr fp, Object o, Object t, EventArgs e) +14
      System.Web.Util.CalliEventHandlerDelegateProxy.Callback(Object sender, EventArgs e) +35
      System.Web.UI.Control.OnInit(EventArgs e) +91
      System.Web.UI.UserControl.OnInit(EventArgs e) +83
      System.Web.UI.Control.InitRecursive(Control namingContainer) +140
      System.Web.UI.Control.AddedControl(Control control, Int32 index) +197
      System.Web.UI.ControlCollection.Add(Control child) +79
      System.Web.DynamicData.DynamicFilter.EnsureInit(IQueryableDataSource dataSource) +200
      System.Web.DynamicData.QueryableFilterRepeater.<Page_InitComplete>b__1(DynamicFilter f) +11
      System.Collections.Generic.List`1.ForEach(Action`1 action) +145
      System.Web.DynamicData.QueryableFilterRepeater.Page_InitComplete(Object sender, EventArgs e) +607
      System.EventHandler.Invoke(Object sender, EventArgs e) +0
      System.Web.UI.Page.OnInitComplete(EventArgs e) +8871862
      System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +604

    Version Information: Microsoft .NET Framework Version:4.0.30319; ASP.NET Version:4.0.30319.1
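
    A quick diagnostic that may help isolate this: error 26 means the client could not even locate the SQLEXPRESS_2008 instance, which usually points at instance-name resolution (for example a stopped SQL Server Browser service) rather than at Dynamic Data itself. A minimal check from a command prompt, using the instance name from the connection string above:

    sqlcmd -S GREG\SQLEXPRESS_2008 -E -Q "SELECT @@SERVERNAME"

    If this fails too, the instance name or the SQL Server Browser service is the likely culprit; if it succeeds, the problem is more likely in how the Entity Framework connection string is resolved for the Dynamic Data filter pages.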

    Read the article

  • SQL SERVER – How to Recover SQL Database Data Deleted by Accident

    - by Pinal Dave
    In Repair a SQL Server database using a transaction log explorer, I showed how to use ApexSQL Log, a SQL Server transaction log viewer, to recover a SQL Server database after a disaster. In this blog, I'll show you how to use another SQL Server disaster recovery tool from ApexSQL in a situation when data is accidentally deleted. You can download ApexSQL Recover here, install, and play along.

    With a good SQL Server disaster recovery strategy, data recovery is not a problem. You have a reliable full database backup with valid data, a full database backup and subsequent differential database backups, or a full database backup and a chain of transaction log backups. But not all situations are ideal. Here we'll address some sub-optimal scenarios where you can still successfully recover data.

    If you have only a full database backup

    This is the least optimal SQL Server disaster recovery strategy, as it doesn't ensure minimal data loss. For example, data was deleted on Wednesday. Your last full database backup was created on Sunday, three days before the records were deleted. By using the full database backup created on Sunday, you will be able to recover the SQL database records that existed in the table on Sunday. If there were any records inserted into the table on Monday or Tuesday, they will be lost forever. The same goes for records modified in this period: this method will not bring back modified records, only the old records that existed on Sunday. If you restore this full database backup, all your changes (intentional and accidental) will be lost and the database will be reverted to the state it had on Sunday. What you have to do is compare the records that were in the table on Sunday to the records on Wednesday, create a synchronization script, and execute it against the Wednesday database.

    If you have a full database backup followed by differential database backups

    Let's say the situation is the same as in the example above, only you create a differential database backup every night. Use the full database backup created on Sunday, and the last differential database backup (created on Tuesday). In this scenario, you will lose only the data inserted and updated after the differential backup created on Tuesday.

    If you have a full database backup and a chain of transaction log backups

    This is the SQL Server disaster recovery strategy that provides minimal data loss. With a full chain of transaction logs, you can recover the SQL database to an exact point in time (a minimal T-SQL sketch of such a point-in-time restore appears at the end of this post). To provide optimal results, you have to know exactly when the records were deleted, because restoring to a later point will not bring back the records. This method requires restoring the full database backup first. If you have any differential database backup created after the last full database backup, restore the most recent one. Then restore the transaction log backups one by one, in the order they were created, starting with the first one created after the restored differential database backup. Now the table will be in the state it was in before the records were deleted. You have to identify the deleted records, script them, and run the script against the original database. Although this method is reliable, it is time-consuming and requires a lot of disk space.

    How to easily recover deleted records?

    The following solution enables you to recover SQL database records even if you have no full or differential database backups and no transaction log backups.
    To understand how ApexSQL Recover works, I'll explain what happens when table data is deleted. Table data is stored in data pages. When you delete table records, they are not immediately deleted from the data pages, but marked to be overwritten by new records. Such records are no longer shown as existing, but ApexSQL Recover can read them and create an undo script for them.

    How long will deleted records stay in the MDF file? It depends on many factors; as time passes, it becomes more likely that the records will be overwritten. The more transactions that occur after the deletion, the greater the chance the records will be overwritten and permanently lost. Therefore, it's recommended to create a copy of the database MDF and LDF files immediately (if you cannot take your database offline until the issue is solved) and run ApexSQL Recover on the copies. Note that a full database backup will not help here, as the records marked for overwriting are not included in the backup.

    First, I'll delete some records from the Person.EmailAddress table in the AdventureWorks database. I can delete these records in SQL Server Management Studio, or execute a script such as:

    DELETE FROM Person.EmailAddress
    WHERE BusinessEntityID BETWEEN 70 AND 80

    Then, I'll start ApexSQL Recover and select From DELETE operation in the Recovery tab.

    In the Select the database to recover step, first select the SQL Server instance. If it's not shown in the drop-down list, click the Server icon to the right of the Server drop-down list and browse for the SQL Server instance, or type the instance name manually. Specify the authentication type and select the database in the Database drop-down list.

    In the next step, you're prompted to add additional data sources. As this can be a tricky step, especially for new users, ApexSQL Recover offers help via the Help me decide option. The Help me decide option guides you through a series of questions about the database transaction log and advises what files to add. If you know that you have no transaction log backups or detached transaction logs, or the online transaction log file has been truncated after the data was deleted, select No additional transaction logs are available. If you know that you have transaction log backups that contain the delete transactions you want to recover, click Add transaction logs. The online transaction log is listed and selected automatically. Click Add to add transaction log backups. It would be best if you had a full transaction log chain, as explained above.

    The next step for this option is to specify the time range. Selecting a small time range around the time of deletion will create the recovery script just for the accidentally deleted records. A wide time range might script records deleted on purpose, and you don't want that. If needed, you can check the generated script and manually remove such records.

    After that, for all data source options, the next step is to select the tables. Be careful here: if you deleted some data from other tables on purpose and don't want to recover it, don't select all tables, as ApexSQL Recover will create the INSERT script for them too.

    The next step offers two options: to create a recovery script that will insert the deleted records back into the Person.EmailAddress table, or to create a new database, create the Person.EmailAddress table in it, and insert the deleted records. I'll select the first one.

    The recovery process is completed and 11 records are found and scripted, as expected.
    To see the script, click View script. ApexSQL Recover has its own script editor, where you can review, modify, and execute the recovery script. The INSERT INTO statements look like:

    INSERT INTO Person.EmailAddress(
        BusinessEntityID,
        EmailAddressID,
        EmailAddress,
        rowguid,
        ModifiedDate)
    VALUES(
        70,
        70,
        N'[email protected]' COLLATE SQL_Latin1_General_CP1_CI_AS,
        'd62c5b4e-c91f-403f-b630-7b7e0fda70ce',
        '20030109 00:00:00.000'
    );

    To execute the script, click Execute in the menu. If you want to check whether the records are really back, execute:

    SELECT * FROM Person.EmailAddress
    WHERE BusinessEntityID BETWEEN 70 AND 80

    As shown, ApexSQL Recover recovers SQL database data after accidental deletes even without a database backup that contains the deleted data and the relevant transaction log backups. ApexSQL Recover reads the deleted data from the database data file, so this method can be used even for databases in the Simple recovery model. Besides recovering SQL database records lost to a DELETE statement, ApexSQL Recover can help when records are lost to a DROP TABLE or TRUNCATE statement, as well as repair a corrupted MDF file that cannot be attached to a SQL Server instance. You can find more information about how to recover SQL database lost data and repair a SQL Server database on the ApexSQL Solution center. There are solutions for various situations when data needs to be recovered.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
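
    The point-in-time sketch promised earlier, for readers who do have a full transaction log chain. The backup file names and the STOPAT timestamp below are placeholders, not values from this article:

    -- Minimal point-in-time restore sketch (hypothetical file names and timestamp)
    RESTORE DATABASE AdventureWorks
    FROM DISK = 'D:\Backup\AdventureWorks_full.bak'
    WITH NORECOVERY;

    RESTORE LOG AdventureWorks
    FROM DISK = 'D:\Backup\AdventureWorks_log_1.trn'
    WITH NORECOVERY;

    -- Stop just before the moment the records were deleted
    RESTORE LOG AdventureWorks
    FROM DISK = 'D:\Backup\AdventureWorks_log_2.trn'
    WITH STOPAT = '2014-03-12 10:15:00', RECOVERY;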

    Read the article

  • Helicon ISAPI Rewrite is processing Rewrite Rules for httpd.conf but not .htaccess

    - by James Lawruk
    We have Helicon ISAPI Rewrite 3 installed on our Windows 2003 Web server. The RewriteRules in the global httpd.conf file work fine. The server hosts several Web sites, and we were hoping to create RewriteRules that apply to specific Web sites. In the IIS properties for each Web site there is a separate ISAPI_Rewrite tab pointing to that site's .htaccess file. None of the rules in the .htaccess files have any effect. Any ideas why the .htaccess files are being ignored?
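
    For reference, this is the kind of minimal rule being tested at the site level; the pattern and target are made-up examples, and ISAPI_Rewrite 3 uses Apache mod_rewrite syntax:

    # Hypothetical test rule placed in the Web site's root .htaccess
    RewriteEngine On
    RewriteBase /
    RewriteRule ^old-page$ /new-page [R=301,L]

    Even a rule this simple is ignored when placed in the .htaccess file, although equivalent rules work from httpd.conf.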

    Read the article

  • Spatial data in the UK

    - by simonsabin
    I am just loving the fact that the Ordnance Survey has now released a huge amount of data that can be used freely. I've downloaded the Panorama(tm) data, http://www.ordnancesurvey.co.uk/oswebsite/products/land-form-panorama-contours/index.html, which is all the contours for the UK. This I've loaded into SQL Server using Safe Software's FME (http://www.safe.com/), because the data comes as AutoCAD DXF files and translating that to SQL Server spatial data is not easy. The FME workbench is not...(read more)
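
    For a flavour of what the loaded contours can look like on the SQL Server side, here is a minimal sketch; the table and column names are my own invention, and 27700 is the SRID for the British National Grid that OS data uses:

    -- Hypothetical table for the loaded contour lines
    CREATE TABLE dbo.Contour (
        ContourID INT IDENTITY(1,1) PRIMARY KEY,
        HeightM   INT NOT NULL,          -- contour height in metres
        Shape     GEOMETRY NOT NULL      -- the contour line itself
    );

    -- Easting/northing coordinates stored with the British National Grid SRID
    INSERT INTO dbo.Contour (HeightM, Shape)
    VALUES (150, geometry::STGeomFromText(
        'LINESTRING (414000 563000, 414120 563080, 414260 563130)', 27700));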

    Read the article

  • OS X Snow Leopard stops processing clicks from Wacom Intuos2

    - by antiver
    I'm using a Wacom Intuos2 graphics tablet with OS X Snow Leopard 10.6.2, and it works great 80% of the time. Every 20 minutes or so, OS X just starts ignoring input from the pen. More specifically, only the pen tip is ignored; the side buttons on the pen still work. Putting the Mac to sleep and waking it back up restores functionality to the pen tip. This happens with both of the latest Wacom drivers claiming compatibility with Snow Leopard: versions 6.1.2-5 (Nov 25, 2009) and 6.1.3-3 (Jan 21, 2010). I have no experience with this tablet on other versions of OS X or other drivers. The tablet works 100% in Windows, which leads me to blame either OS X or the OS X drivers.

    Read the article

  • Linux: Simulate Serial Connection from Arduino

    - by shanet
    I'm trying to simulate the serial connection from an Arduino into a Processing applet, since I don't have an Arduino at the moment. Simply put, I'm trying to send bytes from Bash to a serial connection (on /dev/ttyS0) which the Processing applet will pick up as it would from an Arduino. I tried the answer to the question "How can I send data to the serial port from a Linux shell?", but it's simply not working, and I don't know how to go about debugging something like this since I've never played with serial connections before. Any advice? Thanks much.
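
    One approach that might work, sketched under the assumption that socat is installed: socat can create a pair of linked pseudo-terminals, so Bash writes to one end while the Processing applet opens the other as its serial port. The /dev/pts numbers below are examples; socat prints the actual names when it starts:

    # Create two linked pseudo-terminals; socat prints their names (e.g. /dev/pts/3 and /dev/pts/4)
    socat -d -d pty,raw,echo=0 pty,raw,echo=0

    # From another shell, write a byte to one end, as an Arduino would
    printf '\x41' > /dev/pts/3

    # In the Processing sketch, open the other end as the serial port:
    #   new Serial(this, "/dev/pts/4", 9600);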

    Read the article

  • This Week on the Green Data Center Management Front

    Among the big news this week in green data center management: Horizon Data Center Solutions announces a partnership with Power Loft; Cisco, GDSYS, and Incheon Metropolitan City partner to build a next-generation green data center; and Vycon Energy demonstrates how its flywheel backup power systems can be used for health-care data system power backup.

    Read the article

  • SharePoint Search: processing filenames containing underscores

    - by Todd Owen
    We use SharePoint Server 2007 to allow employees to search network file shares, but it seems that underscores in filenames are not treated as word separators when indexing the files. As a result, a search for chocolate will:

    match "chocolate milkshake.doc"
    but not match "chocolate_cake.doc"

    (Of course, this is a simplified example; in practice the content of the second file might include the word "chocolate" and match on that instead of the filename. But the problem itself is real enough, because a common scenario in a corporate environment is that a user knows the partial name of the file they are looking for and expects to see matching filenames at the top of the search results. And using underscores in filenames is a widely used convention within our company.)

    Underscores are not treated as word separators in the file content either, although this is less of a concern for us. The root cause of this problem is possibly related to the behaviour of the word breakers that SharePoint uses (i.e. the language-specific DLLs that implement the IWordBreaker interface), although I haven't confirmed this yet.

    Does anyone know of a workaround for this issue? I have tested with Search Server 2008 Express too (which is based on the same technology), and it is also affected. I do not know whether the problem is fixed in SharePoint 2010 or not.

    Read the article

  • Real-time Big Data Analytics is a reality for StubHub with Oracle Advanced Analytics

    - by Mark Hornick
    What can you use for a comprehensive platform for real-time analytics? How can you process big data volumes for near-real-time recommendations and dramatically reduce fraud? Learn in this video what StubHub achieved with Oracle R Enterprise from the Oracle Advanced Analytics option to Oracle Database, and read more on their story here.

    Advanced analytics solutions that impact the bottom line of a business are challenging due to the range of skills and individuals involved in realizing such solutions. While we hear a lot about the role of the data scientist, that role is but one piece of the puzzle. Advanced analytics solutions also have an operationalization aspect that requires close proximity to where the transactional activity occurs.

    The data scientist needs access to the right data with which to model the business problem. This involves IT for data collection, management, and administration, as well as ensuring zero downtime (a website needs to be up 24x7). This also involves working with the data scientist to keep predictive models refreshed with the latest scripts.

    Integrating advanced analytics solutions into enterprise apps involves not just generating predictions, but supporting the whole life-cycle: from data collection, to model building, model assessment, and then outcome assessment and feedback into the model-building process again. Application and web interface designers need to take into account how end users will see and use the advanced analytics results, e.g., supporting operations staff who need to handle the potentially fraudulent transactions.

    As just described, advanced analytics projects can be "complicated" from just a human perspective. The extent to which software can simplify the interactions among users and systems will increase the likelihood of project success. The ability to quickly operationalize advanced analytics projects and demonstrate measurable value means the difference between a successful project and just a nice research report. By standardizing on Oracle Database and SQL invocation of R, along with in-database modeling as found in Oracle Advanced Analytics, expedient model deployment and zero downtime for refreshing models become a reality. Meanwhile, data scientists are also able to explore leading-edge techniques available in open source. The Oracle solution propels the entire organization forward to realize the value of advanced analytics.
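
    To make "SQL invocation of R" concrete, here is a rough sketch of the embedded-R pattern Oracle R Enterprise exposes. The script name, its body, and the txns table are invented for illustration, and exact function signatures may differ across ORE versions:

    -- Sketch only: register a trivial R function, then call it from SQL.
    -- 'scoreTxns', its body, and the txns table are hypothetical.
    BEGIN
      sys.rqScriptCreate('scoreTxns',
        'function(dat) { data.frame(score = runif(nrow(dat))) }');
    END;
    /

    SELECT *
    FROM table(rqTableEval(
      cursor(SELECT * FROM txns),        -- input rows handed to the R function
      NULL,                              -- no parameters
      'SELECT 1 AS score FROM dual',     -- shape of the output rows
      'scoreTxns'));                     -- registered script to run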

    Read the article

  • This Week's News From the Green Data Center Management Front

    Among this week's developments in green data center management: myriad federal and state tax credits are now available for Green IT projects related to data centers; the EPA is finalizing its Energy Star program for data centers; and Numara Software has a way to help you better green your data center.

    Read the article

  • Batch processing multi-page TIFF in IrfanView

    - by hemalshah
    I have to convert the DPI of more than 5,000 TIFF images on a monthly basis, from 200x200 to 100x100. I can do that in IrfanView using a .bat file that I have created. The following is the .bat file code:

    @"c:\program files\irfanview\i_view32.exe" "e:\batch1\*.tif" /aspectratio /resample /tifc=4 /dpi=(100,100) /convert="e:\batch2\*.tif" %*

    where /tifc=4 is Fax 4 compression. However, the above code only changes the DPI of the first page of each TIFF; all the other pages are still 200 DPI. I am using Windows XP Professional and IrfanView. Can anyone tell me what I am missing, or suggest an alternative program where I can create a .bat file and run the batch process from the command line?
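
    If IrfanView really is limited to the first page here, one possible alternative (assuming installing ImageMagick is an option) is its convert tool, which processes every page of a multi-page TIFF. A sketch following the question's folder layout:

    @echo off
    rem Hypothetical ImageMagick-based batch: resample all pages of each TIFF
    rem to 100x100 DPI, keep Group4 fax compression, write results to e:\batch2
    for %%f in (e:\batch1\*.tif) do (
      convert "%%f" -resample 100x100 -compress Group4 "e:\batch2\%%~nxf"
    )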

    Read the article

< Previous Page | 83 84 85 86 87 88 89 90 91 92 93 94  | Next Page >