Search Results

Search found 97 results on 4 pages for 'xl'.

  • Guessing Excel Data Types

    - by AjarnMark
    Note to Self HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel: TypeGuessRows = 0 means scan everything. Note to Others About 10 years ago I stumbled across this bit of information just when I needed it and it saved my project.  Then for some reason, a few years later when it would have been nice, but not critical, for some reason I could not find it again anywhere.  Well, now I have stumbled across it again, and to preserve my future self from nightmares and sudden baldness due to pulling my hair out, I have decided to blog it in the hopes that I can find it again this way. Here’s the story…  When you query data from an Excel spreadsheet, such as with old-fashioned DTS packages in SQL 2000 (my first reference) or simply with an OLEDB Data Adapter from ASP.NET (recent task) and if you are using the Microsoft Jet 4.0 driver (newer ones may deal with this differently) then you can get funny results where the query reports back that a cell value is null even when you know it contains data. What happens is that Excel doesn’t really have data types.  While you can format information in cells to appear like certain data types (e.g. Date, Time, Decimal, Text, etc.) that is not really defining the cell as being of a certain type like we think of when working with databases.  But, presumably, to make things more convenient for the user (programmer) when you issue a query against Excel, the query processor tries to guess what type of data is contained in each column and returns it in an appropriate manner.  This is all well and good IF your data is consistent in every row and matches what the processor guessed.  And, for efficiency’s sake, when the query processor is trying to figure out each column’s data type, it does so by analyzing only the first 8 rows of data (default setting). Now here’s the problem, suppose that your spreadsheet contains information about clothing, and one of the columns is Size.  Now suppose that in the first 8 rows, all of your sizes look like 32, 34, 18, 10, and so on, using numbers, but then, somewhere after the 8th row, you have some rows with sizes like S, M, L, XL.  What happens is that by examining only the first 8 rows, the query processor inferred that the column contained numerical data, and then when it hits the non-numerical data in later rows, it comes back blank.  Major bummer, and a real pain to track down if you don’t know that Excel is doing this, because you study the spreadsheet and say, “the data is RIGHT THERE!  WHY doesn’t the query see it?!?!”  And the hair-pulling begins. So, what’s a developer to do?  One option is to go to the registry setting noted above and change the DWORD value of TypeGuessRows from the default of 8 to 0 (zero).  Setting this value to zero will force Jet to scan every row in the spreadsheet before making its determination as to what type of data the column contains.  And that means that in the example above, it would have treated the column as a string rather than as numeric, and presto! your query now returns all of the values that you know are in there. Of course, there is a caveat… if you are querying large spreadsheets, making Jet scan every row can be quite a performance hit.  You could enter a different number (more than 8) that you believe is a better sampling of rows to make the guess, but you still have the possibility that every row scanned looks alike, but that later rows are different, and that you might get blanks when there really is data there.  
That’s the type of gamble I really don’t like to take with my data. Anyone with a better approach, or with experience with more recent drivers that have a better way of handling data types, please chime in!
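
    A minimal sketch of the querying side, in C#, assuming the Jet 4.0 provider and a hypothetical Sheet1: besides the TypeGuessRows registry tweak described above, adding IMEX=1 to the extended properties tells Jet to honor the ImportMixedTypes=Text setting, so columns with mixed values come back as text instead of nulls.

        using System.Data;
        using System.Data.OleDb;

        class ExcelReadSketch
        {
            static DataTable ReadSheet(string path)
            {
                // Hypothetical sheet name; IMEX=1 makes Jet treat mixed-type columns as text
                // (together with HKLM\...\Jet\4.0\Engines\Excel\ImportMixedTypes = Text).
                string connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + path +
                                 ";Extended Properties=\"Excel 8.0;HDR=YES;IMEX=1\"";
                var table = new DataTable();
                using (var conn = new OleDbConnection(connStr))
                using (var adapter = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", conn))
                {
                    adapter.Fill(table); // nulls here usually mean the type guess didn't match the cell
                }
                return table;
            }
        }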

    Read the article

  • Sparse virtual machine disk image resizing weirdness?

    - by Matt H
    I have a partitioned virtual machine disk image created by vmware. What I want to do is resize that by 10GB. The file size is showing as 64424509440. Or 60GB. So I ran this: dd if=/dev/zero of=./win7.img seek=146800640 count=0 It ran without errors and I can verify the new size is in fact 75161927680 bytes or 70GB. This is where it gets a little odd. I started the guest domain in xen which is a Windows 7 enterprise machine. What I was expecting to see in diskmgmt.msc is 2 partitions. 1 system partition at the start of around 100MB and near 60GB partition (which is C drive) followed by around 10GB of free space. Actually what I saw was a 70GB partition!?! That confused me... so I decided to run the Check Disk which when you set it on the C drive it asks you to reboot so it'll run on boot. So I did that and during the boot it ran the checks. It got all the way through stage 3 and didn't show any errors at all. Looked at the partitions in disk manager and now C drive has shrunk back to 60GB and there is no free space. What gives? Ok, I thought I'd try mounting it under Dom0 and examining it with fdisk. This is what I get when mounted sudo xl block-attach 0 tap:aio:/home/xen/vms/otoy_v1202-xen.img xvda w sudo fdisk -l /dev/xvda Disk /dev/xvda: 64.4 GB, 64424509440 bytes 255 heads, 63 sectors/track, 7832 cylinders Units = cylinders of 16065 * 512 = 8225280 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0x582dfc96 Device Boot Start End Blocks Id System /dev/xvda1 * 1 13 102400 7 HPFS/NTFS Partition 1 does not end on cylinder boundary. /dev/xvda2 13 7833 62810112 7 HPFS/NTFS Note the cylinder boundary comment. When I run sudo cfdisk /dev/xvda I get: FATAL ERROR: Bad primary partition 1: Partition ends in the final partial cylinder Press any key to exit cfdisk So I guess this is a bigger problem than first thought. How can I fix this? EDIT: Oops, the cylinder boundary thing is not a problem at all since disks have used LBA etc. So that threw me for a moment... still the problem exists... Now this output looks a little different. sudo sfdisk -uS -l /dev/xvda Disk /dev/xvda: 7832 cylinders, 255 heads, 63 sectors/track Units = sectors of 512 bytes, counting from 0 Device Boot Start End #sectors Id System /dev/xvda1 * 2048 206847 204800 7 HPFS/NTFS /dev/xvda2 206848 125827071 125620224 7 HPFS/NTFS /dev/xvda3 0 - 0 0 Empty /dev/xvda4 0 - 0 0 Empty BTW: I do have a backup of the image so if you help me mess it up that's ok. EDIT: sudo parted /dev/xvda print free Model: Xen Virtual Block Device (xvd) Disk /dev/xvda: 64.4GB Sector size (logical/physical): 512B/512B Partition Table: msdos Number Start End Size Type File system Flags 32.3kB 1049kB 1016kB Free Space 1 1049kB 106MB 105MB primary ntfs boot 2 106MB 64.4GB 64.3GB primary ntfs 64.4GB 64.4GB 1049kB Free Space Cool. Linux is showing free space is 10GB which is what I expect. The problem is windows isn't seeing this?
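
    A quick sanity check on the dd arithmetic (not in the original post): dd's default block size is 512 bytes, and seek=146800640 with count=0 simply extends the file to 146800640 × 512 = 75,161,927,680 bytes (70 GiB); the original 64,424,509,440-byte image is 125,829,120 sectors (60 GiB), so the sparse file itself did grow by the expected 10 GiB, and the sfdisk listing above (partition 2 ending at sector 125,827,071) confirms the extra sectors lie past the end of the existing partitions.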

    Read the article

  • Weblogic JDBC datasource,java.sql.SQLException: Cannot obtain XAConnection weblogic.common.resourcep

    - by gauravkarnatak
    I am using a WebLogic JDBC datasource and my DB is Oracle 10g; below is the configuration. It used to work fine but suddenly it started giving problems, please see the exception below. Weblogic JDBC datasource, java.sql.SQLException: Cannot obtain XAConnection weblogic.common.resourcepool.ResourceLimitException: No resources currently available in pool <?xml version="1.0" encoding="UTF-8"?> <jdbc-data-source xmlns="http://www.bea.com/ns/weblogic/90" xmlns:sec="http://www.bea.com/ns/weblogic/90/security" xmlns:wls="http://www.bea.com/ns/weblogic/90/security/wls" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.bea.com/ns/weblogic/920 http://www.bea.com/ns/weblogic/920.xsd" XL-Reference-DS jdbc:oracle:oci:@abc.COM oracle.jdbc.driver.OracleDriver user DEV_260908 password password dll ocijdbc10 protocol oci oracle.jdbc.V8Compatible true baseDriverClass oracle.jdbc.driver.OracleDriver 1 100 1 true SQL SELECT 1 FROM DUAL DataJndi OnePhaseCommit (The element names were stripped when the descriptor was pasted, so only the values remain.) This exception occurs on a dev environment where only one user is connected. I know this is related to the pool max size, but I also suspect it could be Oracle itself, maybe Oracle isn't able to create connections. Below are my questions: Is there any debug/logging parameter to enable datasource logging, so that I can check the number of connections acquired, released and unused in the logs? How do I check the Oracle connection limit for a particular user?

    Read the article

  • Magento MAGMI: Product attributes (custom options) not showing up in import

    - by Rodgers and Hammertime
    When importing a CSV into Magento with the MAGMI importing tool, I am unable to import Custom Options (as in size: small/medium/large). The import manages to put in the basic products, but the Custom Options don't transfer across. By custom options I mean the fields Title, Input Type, Is Required, Sort Order and then, repeated for each value, Title, Price, Price Type, SKU, Sort Order, and so on ... found in the custom options menu... Even using the example CSV from the MAGMI SourceForge Wiki: sku,name,description,price,Size:drop_down:1 T-Shirt1,T-Shirt,A T-Shirt,5.00,Small|Medium|Large T-Shirt2,T-Shirt2,Another T-Shirt,6.00,XS|S|M|L|XL ...it fails to import the attributes. So I'm simply using MAGMI with the supplied example data from SourceForge on a blank Magento product list, and it doesn't transfer properly. Can anyone shed any light on what might be wrong? I am using Magento ver. 1.6.1.0 if that changes anything. Thanks.

    Read the article

  • How can I select values from different rows depending on the most recent entry date, all for the sam

    - by user321185
    Basically I have a table which is used to hold employee work wear details. It is formed of the columns: EmployeeID, CostCentre, AssociateLevel, IssueDate, TrouserSize, TrouserLength, TopSize & ShoeSize. An employee can be assigned a pair of trousers, a top and shoes at the same time, or only one or two pieces of clothing. As we all know, people's sizes and employee levels can change, which is why I need help really. Different types of employees (associate levels) require different colours of clothing, but you can ignore this part. Every time an employee receives an item of clothing, a new row is inserted into the table with an input date. I need to be able to select the most recent clothes size for each item of clothing for each employee. It is not necessary for all the columns to hold values because an employee could receive trousers or polo shirts at different times in the year. So for example, if employee '54664LSS' was given a pair of 'XL' trousers and an 'L' top on 24/03/11 but then received an 'M' top on 26/05/11, the input of these items would be held on two different rows, obviously. So if I wanted to select the most recent clothing for each clothes category, then the values of the 'M' sized top and the 'XL' sized trousers would need to be returned. Any help would be greatly appreciated as I'm pretty stuck :(. Thanks.
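
    The excerpt asks for SQL, but as an illustration of the selection logic ("for each employee, take the latest non-null value per clothing column"), here is a hedged sketch in C#/LINQ with made-up field names; the SQL equivalent would typically use ROW_NUMBER() or a correlated subquery per column.

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class Issue
        {
            // Hypothetical in-memory mirror of the work-wear table.
            public string EmployeeID;
            public DateTime IssueDate;
            public string TrouserSize;
            public string TopSize;
            public string ShoeSize;
        }

        static class LatestSizes
        {
            // Latest non-null value of one clothing column among one employee's rows.
            static string Latest(IEnumerable<Issue> rows, Func<Issue, string> column) =>
                rows.Where(r => column(r) != null)
                    .OrderByDescending(r => r.IssueDate)
                    .Select(column)
                    .FirstOrDefault();

            public static void Main()
            {
                var issues = new List<Issue>
                {
                    new Issue { EmployeeID = "54664LSS", IssueDate = new DateTime(2011, 3, 24), TrouserSize = "XL", TopSize = "L" },
                    new Issue { EmployeeID = "54664LSS", IssueDate = new DateTime(2011, 5, 26), TopSize = "M" }
                };

                var latest = issues
                    .GroupBy(i => i.EmployeeID)
                    .Select(g => new
                    {
                        EmployeeID = g.Key,
                        TrouserSize = Latest(g, r => r.TrouserSize),
                        TopSize = Latest(g, r => r.TopSize),
                        ShoeSize = Latest(g, r => r.ShoeSize)
                    });

                foreach (var row in latest)
                    Console.WriteLine($"{row.EmployeeID}: trousers={row.TrouserSize}, top={row.TopSize}, shoes={row.ShoeSize}");
            }
        }

    Run against the example in the question, this returns trousers=XL and top=M for employee 54664LSS.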

    Read the article

  • How to effectively use WorkbookBeforeClose event correctly?

    - by Ahmad
    On a daily basis, a person needs to check that specific workbooks have been correctly updated with Bloomberg and Reuters market data, i.e. all data has pulled through and the 'numbers look correct'. In the past, people were not checking the 'numbers', which led to inaccurate uploads to other systems etc. The idea is that 'something' needs to be developed to prevent the user from closing/saving the workbook unless he/she has checked that the updates are correct/accurate. The 'numbers look correct' check is purely an intuitive exercise, thus it will not be coded in any way. The simple solution was to prompt users prior to closing the specific workbook to verify that the data has been checked. Using VSTO SE for Excel 2007, an Add-in was created which hooks into the WorkbookBeforeClose event, which is initialised in the add-in's ThisAddIn_Startup:

        private void wb_BeforeClose(Xl.Workbook wb, ref bool cancel)
        {
            //.... snip ...
            if (list.Contains(wb.Name))
            {
                DialogResult result = MessageBox.Show("some message", "sometitle", MessageBoxButtons.YesNo);
                if (result != DialogResult.Yes)
                {
                    cancel = true; // i think this prevents the whole application from closing
                }
            }
        }

    I have found the question "ThisApplication.WorkbookBeforeSave vs ThisWorkbook.Application.WorkbookBeforeSave", which recommends that one should use the ThisApplication.WorkbookBeforeClose event, which I think is what I am doing since it will span all files opened. The issue I have with the approach is that, assuming I have several files open, some of which are in my list, the event prevents Excel from closing all files sequentially. It now requires each file to be closed individually. Am I using the event correctly, and is this effective & efficient use of the event? Should I use the Application-level event or the document-level event? Is there a way to prevent the above behaviour? Any other suggestions are welcome. (VS 2005 with VSTO SE)
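
    For reference, a minimal sketch (not from the original post) of how the application-level hookup in ThisAddIn_Startup usually looks with the interop types the excerpt is using; these two methods live inside the VSTO ThisAddIn class:

        private void ThisAddIn_Startup(object sender, System.EventArgs e)
        {
            // Application-level event: fires once per workbook that the Excel instance closes.
            this.Application.WorkbookBeforeClose +=
                new Microsoft.Office.Interop.Excel.AppEvents_WorkbookBeforeCloseEventHandler(wb_BeforeClose);
        }

        private void wb_BeforeClose(Microsoft.Office.Interop.Excel.Workbook wb, ref bool cancel)
        {
            // Setting cancel = true vetoes the close of this particular workbook;
            // the prompt/verification logic from the excerpt goes here.
        }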

    Read the article

  • When is a>a true ?

    - by Cricri
    Right, I think I really am living a dream. I have the following piece of code which I compile and run on an AIX machine: AIX 3 5, PowerPC_POWER5 processor type, IBM XL C/C++ for AIX, V10.1, Version: 10.01.0000.0003

        #include <stdio.h>
        #include <math.h>

        #define RADIAN(x) ((x) * acos(0.0) / 90.0)

        double nearest_distance(double radius, double lon1, double lat1, double lon2, double lat2) {
            double rlat1 = RADIAN(lat1);
            double rlat2 = RADIAN(lat2);
            double rlon1 = lon1;
            double rlon2 = lon2;
            double a = 0, b = 0, c = 0;
            a = sin(rlat1)*sin(rlat2) + cos(rlat1)*cos(rlat2)*cos(rlon2-rlon1);
            printf("%lf\n", a);
            if (a > 1) {
                printf("aaaaaaaaaaaaaaaa\n");
            }
            b = acos(a);
            c = radius * b;
            return radius*(acos(sin(rlat1)*sin(rlat2) + cos(rlat1)*cos(rlat2)*cos(rlon2-rlon1)));
        }

        int main(int argc, char** argv) {
            nearest_distance(6367.47, 10, 64, 10, 64);
            return 0;
        }

    Now, the value of 'a' after the calculation is reported as being '1'. And, on this AIX machine, it looks like 1 > 1 is true, as my 'if' is entered!!! And my acos of what I think is '1' returns NanQ since 1 is bigger than 1. May I ask how that is even possible? I do not know what to think anymore! The code works just fine on other architectures where 'a' really takes the value of what I think is 1 and acos(a) is 0.
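
    A hedged side note, not from the original question: printf("%lf") rounds its output, so 'a' can be a hair above 1.0 even though it prints as 1.000000; round-off or fused multiply-add contraction in the sin/cos expression is a common cause on this kind of hardware. The usual defensive fix, sketched here in C# (the C version has exactly the same shape), is to clamp before calling acos:

        using System;

        static class SafeTrig
        {
            // Clamp the cosine argument into [-1, 1] before acos so tiny
            // floating-point overshoot cannot produce NaN.
            public static double SafeAcos(double x)
            {
                if (x > 1.0) x = 1.0;
                if (x < -1.0) x = -1.0;
                return Math.Acos(x);
            }
        }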

    Read the article

  • implementing cryptographic algorithms, specifically the key expansion part

    - by masseyc
    Hey, recently I picked up a copy of Applied Cryptography by Bruce Schneier and it's been a good read. I now understand how several algorithms outlined in the book work, and I'd like to start implementing a few of them in C. One thing that many of the algorithms have in common is dividing an x-bit key, into several smaller y-bit keys. For example, blowfish's key, X, is 64-bits, but you are required to break it up into two 32-bit halves; Xl and Xr. This is where I'm getting stuck. I'm fairly decent with C, but I'm not the strongest when it comes to bitwise operators and the like. After some help on IRC, I managed to come up with these two macros: #define splitup(a, b, c) {b = a >> 32; c = a & 0xffffffff; } #define combine(a, b, c) {a = (c << 32) | a;} Where a is 64 bits and b and c are 32 bits. However, the compiler warns me about the fact that I'm shifting a 32 bit variable by 32 bits. My questions are these: what's bad about shifting a 32-bit variable 32 bits? I'm guessing it's undefined, but these macros do seem to be working. Also, would you suggest I go about this another way? As I said, I'm fairly familiar with C, but bitwise operators and the like still give me a headache.
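
    A small hedged aside: the 64-bit X that Blowfish splits into Xl and Xr is the data block rather than the key, and the compiler warning is the real clue here, since in C shifting a 32-bit value by 32 bits is undefined; the shift has to happen on a 64-bit type. A sketch in C# (in C the same idea is an explicit cast to a 64-bit type before shifting):

        static class Bits64
        {
            // Split a 64-bit value into two 32-bit halves.
            public static void Split(ulong x, out uint left, out uint right)
            {
                left = (uint)(x >> 32);           // shifting a 64-bit value by 32 is well defined
                right = (uint)(x & 0xFFFFFFFFu);
            }

            // Recombine: widen to 64 bits BEFORE shifting, not after.
            public static ulong Combine(uint left, uint right)
            {
                return ((ulong)left << 32) | right;
            }
        }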

    Read the article

  • C++ compilers and back/front ends

    - by aaa
    Hello. For my own education I am curious which compilers use which C++ front-end and back-end. Can you enlighten me where the following technologies are used and what hallmarks/advantages they have, if any? Open64 - is it a back-end, a front-end, or both? Which compilers use it? I encountered it in the CUDA compiler. EDG - as far as I can tell this is a front-end used by the Intel compilers and Comeau. Do other compilers use it? I found quite a few references to it in the boost source code. ANTLR - this is a general parser generator. Do any common compilers use it? Regarding compilers: which front-end/back-end does the gcc compiler suite use? Does it have common heritage with any other compiler? What front-end/back-end do the PGI and PathScale compilers use? What front-end/back-end does the XL compiler (IBM's offering) use? Thanks.

    Read the article

  • Template access of symbol in unnamed namespace

    - by Fred Larson
    We are upgrading our XL C/C++ compiler from V8.0 to V10.1 and found some code that is now giving us an error, even though it compiled under V8.0. Here's a minimal example: test.h: #include <iostream> #include <string> template <class T> void f() { std::cout << TEST << std::endl; } test.cpp: #include <string> #include "test.h" namespace { std::string TEST = "test"; } int main() { f<int>(); return 0; } Under V10.1, we get the following error: "test.h", line 7.16: 1540-0274 (S) The name lookup for "TEST" did not find a declaration. "test.cpp", line 6.15: 1540-1303 (I) "std::string TEST" is not visible. "test.h", line 5.6: 1540-0700 (I) The previous message was produced while processing "f<int>()". "test.cpp", line 11.3: 1540-0700 (I) The previous message was produced while processing "main()". We found a similar difference between g++ 3.3.2 and 4.3.2. I also found in g++, if I move the #include "test.h" to be after the unnamed namespace declaration, the compile error goes away. So here's my question: what does the Standard say about this? When a template is instantiated, is that instance considered to be declared at the point where the template itself was declared, or is the standard not that clear on this point? I did some looking though the n2461.pdf draft, but didn't really come up with anything definitive.

    Read the article

  • Syncronizing XML file with MySQL database

    - by Fred K
    My company uses an internal management application for storing products. They want to copy all the products into a MySQL database so they can make their products available on the company website. Note: they will continue to use their own internal software. This software can export all the products in various file formats (including XML). The synchronization does not have to be in real time; they are happy to synchronize the MySQL database once a day (late at night). Also, each product in their software has one or more images, so I also have to make the images available on the website. Here is an example of an XML export: <?xml version="1.0" encoding="UTF-8"?> <export_management userid="78643"> <product id="1234"> <version>100</version> <insert_date>2013-12-12 00:00:00</insert_date> <warrenty>true</warrenty> <price>139,00</price> <model> <code>324234345</code> <model>Notredame</model> <color>red</color> <size>XL</size> </model> <internal> <color>green</color> <size>S</size> </internal> <options> <s_option>some option</s_option> <s_option>some option</s_option> <extra_option>some option</extra_option> <extra_option>some option</extra_option> </options> <images> <image> <small>1234_0.jpg</small> </image> <image> <small>1234_1.jpg</small> </image> </images> </product> </export_management> Any ideas on how I can do it, or better approaches, are welcome.
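
    One possible shape for the nightly job, as a hedged C# sketch (table name, columns, connection string and the data-access library are assumptions; the website could equally run this as a small PHP or Python script): parse the export with XDocument and upsert each product, then copy the referenced image files to the web server.

        using System;
        using System.Xml.Linq;
        using MySql.Data.MySqlClient;   // assumption: MySQL Connector/NET is available

        class ProductSync
        {
            static void Sync(string xmlPath, string connectionString)
            {
                XDocument doc = XDocument.Load(xmlPath);
                using (var conn = new MySqlConnection(connectionString))
                {
                    conn.Open();
                    foreach (XElement product in doc.Root.Elements("product"))
                    {
                        // "products" and its columns are hypothetical; upsert keyed on the export's product id.
                        var cmd = new MySqlCommand(
                            "INSERT INTO products (id, price, size) VALUES (@id, @price, @size) " +
                            "ON DUPLICATE KEY UPDATE price = @price, size = @size", conn);
                        string size = (string)product.Element("model")?.Element("size");
                        cmd.Parameters.AddWithValue("@id", (int)product.Attribute("id"));
                        cmd.Parameters.AddWithValue("@price", (string)product.Element("price"));
                        cmd.Parameters.AddWithValue("@size", (object)size ?? DBNull.Value);
                        cmd.ExecuteNonQuery();
                    }
                }
            }
        }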

    Read the article

  • XBRL - Moving from Production to Consumption

    - by jmorourke
    Here's an update on what’s new with XBRL and how it can actually benefit your organization versus adding extra time and costs to financial reporting.  On February 29th (leap day) of 2012 I attended the XBRL and Financial Analysis Technology Conference at Baruch College in NYC.  The event, which attracted over 300 XBRL gurus and fans was presented by XBRL US, The New York Society of Security Analysts’ Improved Corporate Reporting Committee, and Baruch College’s Robert Zicklin Center for Corporate Integrity.  The event featured keynotes from the U.S. Securities and Exchange Commission (SEC), and the CFA Institute as well as panels covering alternative research tools and data, corporate reporting to stakeholders and a demonstration of XBRL analysis tools.  The program culminated in a presentation of the finalists and the winner of the $20,000 XBRL Challenge.    Some of the key points made in the sessions included: The focus of XBRL tools is moving from production to consumption. As of February 2012, over 9000 companies are reporting in XBRL, with over 10 million facts filed to date XBRL taxonomy extensions have dropped from 27% to 11% making comparisons easier The SEC reports that XBRL makes it easier to analyze disclosures, focus on accounting issues XBRL is helping standards-setters like the FASB speed their analysis of impacts of proposed accounting rule changes Companies like Thomson Reuters report that XBRL is helping speed the delivery of data to clients The most interesting part of the program though, was the session highlighting the 5 finalists in the XBRL Challenge competition and the winning solution.  The XBRL Challenge was launched in 2011 as a means of spurring the development of more end-user tools to help with the consumption of XBRL-based financial information.       Over an 8-month process handled by 5 judges, there were 84 registrants, 15 completed submissions, 5 finalists and one winner of the challenge.  All of the solutions are open-sourced tools and most of them focus on consuming XBRL-based data.  The 5 finalists included: Advanced XBRL Processing from Oxide solutions – XBRL viewer for taxonomies, filings and company data with peer comparison capabilities. Arrelle – API for XBRL processes, supports SEC Validations, RSS Feeds to access filings etc. Calcbench – XBRL data analysis tool that can be embedded in other web applications.  This tool can combine XBRL filings with real-time market data. XBRL to XL – allows the importing of XBRL data into Microsoft Excel for analysis, comparisons.  Users start on the web and populate Excel with XBRL data. XBurble – allows users to search and view XBRL filings, export to Excel, merge for comparison, and includes a workflow interface. The winner of the $20,000 XBRL Challenge prize was CalcBench.  More information about the XBRL Challenge and the finalists can be found at www.XBRLUS.org/challenge XBRL for Sustainability Reporting – other recent news on the XBRL front was the announcement by the Global Reporting Initiative (GRI) of an XBRL taxonomy for Sustainability Reporting.  This taxonomy was co-developed by the GRI and Deloitte and is designed to make the consumption of data found in Sustainability Reports much easier.  
Although there is no government mandate to file Sustainability Reports in XBRL format, organizations that do use the GRI guidelines for Sustainability Reporting are encouraged to tag and submit their data voluntarily to the GRI – who will populate a database with Sustainability Reporting data and make this available to the public. For more information about this initiative, you can go to the GRI web site: www.globalreporting.org. So how does all of this benefit corporate filers and investors? Since its introduction, the consensus in the market is that XBRL has mainly benefited the regulators and investment analysts who need to consume and analyze large volumes of financial data. But the emergence of more end-user tools for consuming and analyzing XBRL-based data, and the ability to perform quick comparisons of one company versus its peers and competitors in an industry group, will soon accelerate the benefits to corporate finance staff, as well as individual investors. This could apply to financial results tagged in XBRL, as well as non-financial information such as Sustainability Reporting – which over the long term will likely be integrated with financial reporting. And as multiple regulators and agencies in a country adopt the XBRL standard for corporate filings, more benefits will accrue as companies will be able to leverage one set of XBRL-based financial data for multiple regulatory filings. For more information about the latest developments in XBRL, check out the XBRL US or XBRL International web sites: www.xbrl.org, www.xbrlus.org. For more information about what Oracle is doing to support XBRL, here are some links: http://www.oracle.com/us/solutions/ent-performance-bi/disclosure-management-065892.html http://www.oracle.com/technetwork/database/features/xmldb/index-087631.html Feel free to contact me if you have any questions or need more information: [email protected]

    Read the article

  • For a .NET winforms datagridview I would like a combobox column to have a different set of values for each row.

    - by Seth Spearman
    Hello, I have a DataGridView that I am binding to a POCO. I have the databinding working fine. However, I have added a combobox column that I want to be different for each row. Specifically, I have a grid of purchased items, some of which have sizes (like Adult XL, Adult L) and other items are not sized (like Car Magnet). So essentially what I want to change is the DATA SOURCE for a combobox column in the data grid. Can that be done? What event can I hook into that would allow me to change properties of certain columns FOR EACH ROW? An acceptable alternative is to change a property when the user clicks or tabs into the row. What event is that? Seth EDIT I need more help with this question. With Tridus's help I am SO close, but I need a bit more information. First, per the question, is the CellFormatting event really the best/only event for changing the DataSource of a combo box column? I ask because I am doing something rather resource/data intensive, not merely formatting the cell. Second, the CellFormatting event is being called just by having the mouse hover over the cell. I tried to set the FormattingApplied property inside my if-block and then check for it in the if-test, but that is returning a weird error message. My ideal situation is that I would change the data source for the combo box once for each row and then be done with it. Finally, in order to set the data source of the combobox column I have to be able to cast the cell inside my if-block to a type of DataGridViewComboBoxColumn so that I can fill it with rows or set the datasource or something. Here is the code I have right now.

        Private Sub ProductsDataGrid_CellFormatting(ByVal sender As System.Object, ByVal e As System.Windows.Forms.DataGridViewCellFormattingEventArgs) Handles ProductsDataGrid.CellFormatting
            If e.ColumnIndex = ProductsDataGrid.Columns("SizeDGColumn").Index Then ' AndAlso Not e.FormattingApplied Then
                Dim product As LeagueOrderProductInfo = DirectCast(ProductsDataGrid.Rows(e.RowIndex).DataBoundItem, LeagueOrderProductInfo)
                Dim sizes As LeagueOrderProductSizeList = product.ProductSizes
                sizes.RemoveSizeFromList(_parentOrderDetail.SizeID)
                'WHAT DO I DO HERE TO FILL THE COMBOBOX COLUMN WITH THE sizes collection.
            End If
        End Sub

    Please help. I am completely stuck and this task item should have taken an hour and I am 4+ hours in now. BTW, I am also open to resolving this by taking a completely different direction with it (as long as it can be done quickly). Seth
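
    A hedged sketch of one way to fill that gap, shown in C# for brevity (the VB form is a direct translation): cast the cell in the size column to DataGridViewComboBoxCell and give it its own data source, since per-cell settings override the column-level ones. The member names below are hypothetical and would need to match the size class.

        using System.Windows.Forms;

        static class SizeCellBinding
        {
            // Bind the combo box cell of one row to that row's own list of sizes.
            public static void BindSizesForRow(DataGridView grid, int rowIndex, object sizes)
            {
                var cell = (DataGridViewComboBoxCell)grid.Rows[rowIndex].Cells["SizeDGColumn"];
                cell.DataSource = sizes;            // per-cell source overrides the column's DataSource
                cell.DisplayMember = "SizeName";    // hypothetical property names on the size objects
                cell.ValueMember = "SizeID";
            }
        }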

    Read the article

  • Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/xmlbeans/XmlException

    - by Dheeraj kumar
    Hi, I have to read xls file in java.I used poi-3.6 to read xls file in Eclipse.But i m getting this ERROR"Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/xmlbeans/XmlException at ReadExcel2.main(ReadExcel2.java:38)". I have added following jars 1)poi-3.6-20091214.jar 2)poi-contrib-3.6-20091214.jar 3)poi-examples-3.6-20091214.jar 4)poi-ooxml-3.6-20091214.jar 5)poi-ooxml-schemas-3.6-20091214.jar 6)poi-scratchpad-3.6-20091214.jar Below is the code which i m using: import org.apache.poi.ss.usermodel.Workbook; import org.apache.poi.ss.usermodel.Sheet; import org.apache.poi.ss.usermodel.Row; import org.apache.poi.ss.usermodel.Cell; import org.apache.poi.xssf.usermodel.XSSFWorkbook; import org.apache.poi.xssf.usermodel.XSSFCell; import org.apache.poi.xssf.usermodel.XSSFRow; import java.io.FileInputStream; import java.io.IOException; import java.util.Iterator; import java.util.List; import java.util.ArrayList; public class ReadExcel { public static void main(String[] args) throws Exception { // // An excel file name. You can create a file name with a full path // information. // String filename = "C:\\myExcel.xl"; // // Create an ArrayList to store the data read from excel sheet. // List sheetData = new ArrayList(); FileInputStream fis = null; try { // // Create a FileInputStream that will be use to read the excel file. // fis = new FileInputStream(filename); // // Create an excel workbook from the file system. // // HSSFWorkbook workbook = new HSSFWorkbook(fis); Workbook workbook = new XSSFWorkbook(fis); // // Get the first sheet on the workbook. // Sheet sheet = workbook.getSheetAt(0); // // When we have a sheet object in hand we can iterator on each // sheet's rows and on each row's cells. We store the data read // on an ArrayList so that we can printed the content of the excel // to the console. // Iterator rows = sheet.rowIterator(); while (rows.hasNext()) { Row row = (XSSFRow) rows.next(); Iterator cells = row.cellIterator(); List data = new ArrayList(); while (cells.hasNext()) { Cell cell = (XSSFCell) cells.next(); data.add(cell); } sheetData.add(data); } } catch (IOException e) { e.printStackTrace(); } finally { if (fis != null) { fis.close(); } } showExelData(sheetData); } private static void showExelData(List sheetData) { // // Iterates the data and print it out to the console. // for (int i = 0; i < sheetData.size(); i++) { List list = (List) sheetData.get(i); for (int j = 0; j < list.size(); j++) { Cell cell = (XSSFCell) list.get(j); System.out.print(cell.getRichStringCellValue().getString()); if (j < list.size() - 1) { System.out.print(", "); } } System.out.println(""); } } } Please help. thanks in anticipation, Regards, Dheeraj!

    Read the article

  • Padding error - when using AES Encryption in Java and Decryption in C

    - by user234445
    Hi All, I have a problem while decrypting the xl file in rijndael 'c' code (The file got encrypted in Java through JCE) and this problem is happening only for the excel files types which having formula's. Remaining all file type encryption/decryption is happening properly. (If i decrypt the same file in java the output is coming fine.) While i am dumped a file i can see the difference between java decryption and 'C' file decryption. od -c -b filename(file decrypted in C) 0034620 005 006 \0 \0 \0 \0 022 \0 022 \0 320 004 \0 \0 276 4 005 006 000 000 000 000 022 000 022 000 320 004 000 000 276 064 0034640 \0 \0 \0 \0 \f \f \f \f \f \f \f \f \f \f \f \f 000 000 000 000 014 014 014 014 014 014 014 014 014 014 014 014 0034660 od -c -b filename(file decrypted in Java) 0034620 005 006 \0 \0 \0 \0 022 \0 022 \0 320 004 \0 \0 276 4 005 006 000 000 000 000 022 000 022 000 320 004 000 000 276 064 0034640 \0 \0 \0 \0 000 000 000 000 0034644 (the above is the difference between the dumped files) The following java code i used to encrypt the file. public class AES { /** * Turns array of bytes into string * * @param buf Array of bytes to convert to hex string * @return Generated hex string */ public static void main(String[] args) throws Exception { File file = new File("testxls.xls"); byte[] lContents = new byte[(int) file.length()]; try { FileInputStream fileInputStream = new FileInputStream(file); fileInputStream.read(lContents); } catch (FileNotFoundException e) { e.printStackTrace(); } catch (IOException e1) { e1.printStackTrace(); } try { KeyGenerator kgen = KeyGenerator.getInstance("AES"); kgen.init(256); // 192 and 256 bits may not be available // Generate the secret key specs. SecretKey skey = kgen.generateKey(); //byte[] raw = skey.getEncoded(); byte[] raw = "aabbccddeeffgghhaabbccddeeffgghh".getBytes(); SecretKeySpec skeySpec = new SecretKeySpec(raw, "AES"); Cipher cipher = Cipher.getInstance("AES"); cipher.init(Cipher.ENCRYPT_MODE, skeySpec); byte[] encrypted = cipher.doFinal(lContents); cipher.init(Cipher.DECRYPT_MODE, skeySpec); byte[] original = cipher.doFinal(lContents); FileOutputStream f1 = new FileOutputStream("testxls_java.xls"); f1.write(original); } catch (Exception e) { // TODO Auto-generated catch block e.printStackTrace(); } } } I used the following file for decryption in 'C'. #include <stdio.h> #include "rijndael.h" #define KEYBITS 256 #include <stdio.h> #include "rijndael.h" #define KEYBITS 256 int main(int argc, char **argv) { unsigned long rk[RKLENGTH(KEYBITS)]; unsigned char key[KEYLENGTH(KEYBITS)]; int i; int nrounds; char dummy[100] = "aabbccddeeffgghhaabbccddeeffgghh"; char *password; FILE *input,*output; password = dummy; for (i = 0; i < sizeof(key); i++) key[i] = *password != 0 ? *password++ : 0; input = fopen("doc_for_logu.xlsb", "rb"); if (input == NULL) { fputs("File read error", stderr); return 1; } output = fopen("ori_c_res.xlsb","w"); nrounds = rijndaelSetupDecrypt(rk, key, 256); while (1) { unsigned char plaintext[16]; unsigned char ciphertext[16]; int j; if (fread(ciphertext, sizeof(ciphertext), 1, input) != 1) break; rijndaelDecrypt(rk, nrounds, ciphertext, plaintext); fwrite(plaintext, sizeof(plaintext), 1, output); } fclose(input); fclose(output); }
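
    A hedged observation, not from the original post: in the SunJCE provider, Cipher.getInstance("AES") defaults to AES/ECB/PKCS5Padding, so the Java decryption removes the padding for you, while the raw block-by-block C loop leaves it in place. The trailing 014 (0x0C) bytes in the od dump are exactly that padding, twelve bytes each holding the value 12. Stripping it after decryption looks roughly like this (C# for illustration; the C version is the same few lines):

        using System;

        static class Pkcs5
        {
            public static byte[] Strip(byte[] plaintext)
            {
                // The last byte of a PKCS#5/PKCS#7-padded message is the pad length.
                int pad = plaintext[plaintext.Length - 1];
                if (pad < 1 || pad > 16)
                    return plaintext;              // not padded (or corrupt): leave as is
                byte[] unpadded = new byte[plaintext.Length - pad];
                Array.Copy(plaintext, unpadded, unpadded.Length);
                return unpadded;
            }
        }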

    Read the article

  • CodePlex Daily Summary for Wednesday, December 29, 2010

    CodePlex Daily Summary for Wednesday, December 29, 2010Popular ReleasesDocX: DocX v1.0.0.11: Building Examples projectTo build the Examples project, download DocX.dll and add it as a reference to the project. OverviewThis version of DocX contains many bug fixes, it is a serious step towards a stable release. Added1) Unit testing project, 2) Examples project, 3) To many bug fixes to list here, see the source code change list history.Cosmos (C# Open Source Managed Operating System): 71406: This is the second release supporting the full line of Visual Studio 2010 editions. Changes since release 71246 include: Debug info is now stored in a single .cpdb file (which is a Firebird database) Keyboard input works now (using Console.ReadLine) Console colors work (using Console.ForegroundColor and .BackgroundColor)AutoLoL: AutoLoL v1.5.0: Added the all new Masteries Browser which replaces the Quick Open combobox AutoLoL will now attemt to create file associations for mastery (*.lolm) files Each Mastery Build can now contain keywords that the Masteries Browser will use for filtering Changed the way AutoLoL detects if another instance is already running Changed the format of the mastery files to allow more information stored in* Dialogs will now focus the Ok or Cancel button which allows the user to press Return to clo...Confree for Outlook: Confree for Outlook 1.0: Confree for OutlookFeaturesCreate a Claro/Telmex conference directly from Outlook (only works for Argentina). NotesBy now, only works with Claro/Telmex Argentina (if you are using http://simon.telmex.net.ar, we are good).Paint.NET PSD Plugin: 1.6.0: Handling of layer masks has been greatly improved. Improved reliability. Many PSD files that previously loaded in as garbage will now load in correctly. Parallelized loading. PSD files containing layer masks will load in a bit quicker thanks to the removal of the sequential bottleneck. Hidden layers are no longer made visible on save. Many thanks to the users who helped expose the layer masks problem: Rob Horowitz, M_Lyons10. Please keep sending in those bug reports and PSD repro files!Razor Templating Engine: Razor Templating Engine v1.2: Changes: ADDED: Standard namespaces imports for all templates: System, System.Collections.Generic, System.Linq (Changeset 5635) ADDED: Methods for Precompilation (Changeset 3283) CHANGED: Refactored precompilation to be exposed per-TemplateService. (Changeset 3440) CHANGED: Added more descriptive compilation exception message. (Changeset 3629) FIXED: Forced reference to Microsoft.CSharp to correct support for testing frameworks. (Changeset 3689) FIXED: Added support for nested anonymous obj...PhysicalMeasure C# library: PhysicalMeasure 1.0 Release 2010-12-28: PhysicalMeasure 1.0 Release 2010-12-28Facebook C# SDK: 4.1.1: From 4.1.1 Release: Authentication bug fix caused by facebook change (error with redirects in Safari) Authenticator fix, always returning true From 4.1.0 Release Lots of bug fixes Removed Dynamic Runtime Language dependencies from non-dynamic platforms. Samples included in release for ASP.NET, MVC, Silverlight, Windows Phone 7, WPF, WinForms, and one Visual Basic Sample Changed internal serialization to use Json.net BREAKING CHANGE: Canvas Session is no longer supported. Use Signed...MonitorWang: MonitorWang v1.0.5 (Growler): What's new?Added Growl Notification Finalisers - these are interceptor components that work exclusively with the Growl Publisher. 
These allow you to modify the Growl Notification just prior to it being sent by the publisher. You can inject custom logic to precisely control how the Growl Notification will appear; this includes changing the Growl Priority level and message text. I've created to two Growl Notification Finalisers - one allows you to change the Growl Notification Priorty based on ...Catel - WPF and Silverlight MVVM library: 1.0.0: And there it is, the final release of Catel, and it is no longer a beta version!EnhSim: EnhSim 2.2.7 ALPHA: 2.2.7 ALPHAThis release supports WoW patch 4.03a at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Mongoose has bee...LINQ to Twitter: LINQ to Twitter Beta v2.0.19: Mono 2.8, Silverlight, OAuth, 100% Twitter API coverage, streaming, extensibility via Raw Queries, and added documentation. Bug fixes.Rocket Framework (.Net 4.0): Rocket Framework for Windows V 1.0.0: Architecture is reviewed and adjusted in a way so that I can introduce the Web version and WPF version of this framework next. - Rocket.Core is introduced - Controller button functions revisited and updated - DB is renewed to suite the implemented features - Create New button functionality is changed - Add Question Handling featuresFlickr Wallpaper Rotator (for Windows desktop): Wallpaper Flickr 1.1: Some minor bugfixes (mostly covering when network connection is flakey, so I discovered them all while at my parents' house for Christmas).NoSimplerAccounting: NoSimplerAccounting 6.0: -Fixed a bug in expense category report.NHibernate Mapping Generator: NHibernate Mapping Generator 2.0: Added support for Postgres (Thanks to Angelo)NewLife XCode: XCode v6.5.2010.1223 ????(????v3.5??): XCode v6.5.2010.1223 ????,??: NewLife.Core ??? NewLife.Net ??? XControl ??? XTemplate ????,??C#?????? XAgent ???? NewLife.CommonEnitty ??????(???,XCode??????) XCode?? ?????????,??????????????????,?????95% XCode v3.5.2009.0714 ??,?v3.5?v6.0???????????????,?????????。v3.5???????????,??????????????。 XCoder ??XTemplate?????????,????????XCode??? XCoder_Src ???????(????XTemplate????),??????????????????VivoSocial: VivoSocial 7.4.0: Please see changes: http://support.vivoware.com/project/ChangeLog.aspx?PROJID=48Umbraco CMS: Umbraco 4.6 Beta - codename JUNO: The Umbraco 4.6 beta (codename JUNO) release contains many new features focusing on an improved installation experience, a number of robust developer features, and contains more than 89 bug fixes since the 4.5.2 release. Improved installer experience Updated Starter Kits (Simple, Blog, Personal, Business) Beautiful, free, customizable skins included Skinning engine and Skin customization (see Skinning Documentation Kit) Default dashboards on install with hide option Updated Login t...ASP.NET MVC SiteMap provider: MvcSiteMapProvider 2.3.0: Using NuGet?MvcSiteMapProvider is also listed in the NuGet feed. Learn more... Like the project? Consider a donation!Donate via PayPal via PayPal. Release notesThis will be the last release targeting ASP.NET MVC 2 and .NET 3.5. 
MvcSiteMapProvider 3.0.0 will be targeting ASP.NET MVC 3 and .NET 4 Web.config setting skipAssemblyScanOn has been deprecated in favor of excludeAssembliesForScan and includeAssembliesForScan ISiteMapNodeUrlResolver is now completely responsible for generating th...New Projects42 Sight Holistic Data Warehousing on Microsoft SQL Server 2008: This project is the Holistic DW Template. Included are the 3 MS XL spreadsheet reporting models as described in the book Holistic Data Warehousing. This Template is multi-purpose and requires no modification other than you populating it according to the book examplesAndshev.OrmTools: Tools for generating SQL queries using LINQ. ORM framework is also planned to be implemented.BFG and Dice Roller: This solution will contain 2 concepts, the Dice Roller with hooks for extension into other applications, as well as a Personal Battlefleet Gothic Helper application to help manage, control, and smoothe the flow of BFG games. The BFG application will target the Phone 7 platform.BlondieODOS: its an osBluehat Community projetcts: Bluehat Community projetcts.C# - DBWrapper MySQL: Abro aqui a discussão para criarmos uma classe para manipular dados em MySQL.CardGame for Education: This project goal is to create a small Card Game and to teach the team members how to design, implement and how to use "real world" tools. Englishlearning: My test project!Faran Silver: Silverlight Controls Developed till now. Include: 1- Persian Date Input 2- Close-able Tab Item 3- Shamsi Date Convertergocodelibrary: This is my project.Id3Stuff: Project that throws together some nice id3 mass editing functionsInfoStrat.OakFx: OakFx by InfoStratinivi: iniviJJODESK: This little utility software written in Vb.net 2.0, allows you to : - switch IE proxy, - manage IP, - launch website, - launch other utilty tools , - launch and manage cmd, bat, vbs script. - and much more ..... First I wrote this tool for my personal use, because I work with several networks, and customer sites. I think many other software developpers have the same need. All parameters are customisable, and are store in an ini text file. And it can run on an USB stick. learn MVC: Just keeping all source code in one place. Thats all folksLJF.Utility: Utility?????Lucidity: Lucid WILDMixModes Synergy - The WPF Toolkit: MixModes Synergy 2010 is a complete toolset that simplifies user interface development in Windows Presentation Foundation based applications. Synergy includes rich user interface controls providing professional look and feel for your applications out of the box.Multi Targetted Rss Reader: A Multi Targetted Rss Reader demo that shows how to multi target your view model across different screens (WP7, Silverlight, WPF, Surface).Secure Batch Command Tool For Administrators: A command line tool that will help you to protect your passwords that you should provide in your batch files. This tool encrypt Dos commands and update the batch files in a way that keeps your batch file maintainable and easy to use.SharePoint 2010 Conditional Lookup Control: I created a "Conditional Lookup" control for SharePoint 2010. With this you can connect two lookup fields controls on list forms to fill the second control with values depending on the selected value in the first control. There is a demo inside.sql dot: Ease when making a connection to the sql serverwolisbetter: WOL is better!WP7PasswordsBackup: backup/sync client for windows phone 7 passwords app. ??????: ??????

    Read the article

  • Using ant to register plugins and deploy metadata xmls

    - by Gaurav.gg.goyal
    Ant can be used to register plugins directly to MDS. Following is the ant script to register a plugin zip:

        <target name="register_plugin" depends="compile_package">
            <echo>Register Plugin : ${plugin.base}/${project.name}.zip</echo>
            <java classname="oracle.iam.platformservice.utils.PluginUtility" classpathref="classpath" fork="true">
                <sysproperty key="XL.HomeDir" value="${oim.home.server}"/>
                <sysproperty key="OIM.Username" value="${oim.username}"/>
                <sysproperty key="OIM.UserPassword" value="${oim.password}"/>
                <sysproperty key="ServerURL" value="${oim.url}"/>
                <sysproperty key="PluginZipToRegister" value="${plugin.base}/${project.name}.zip"/>
                <sysproperty key="java.security.auth.login.config" value="${oim.home}\designconsole\config\authwl.conf"/>
                <arg value="REGISTER"/>
                <redirector error="redirector.err" errorproperty="redirector.err" output="redirector.out" outputproperty="redirector.out"/>
            </java>
            <copy file="${plugin.base}/${project.name}.zip" todir="${oim.home.server}\plugins"/>
        </target>

    This script requires the following properties: plugin.base, project.name, oim.home.server, oim.username and oim.password. You can either define a properties file for these properties or define them directly in build.xml. Build.properties will look like:

        # Set the OIM home here
        oim.home=C:/Oracle/Middleware02/Oracle_IDM
        # Set the weblogic home here
        wls.home=C:/Oracle/Middleware02/wlserver_10.3
        OIM.ServerName=oim_server1
        # e.g.: used in building the jar and zip files
        # Note: no spaces in the project name
        project.name=ScheduledTask_Sample
        # Set the oim username
        oim.username=xelsysadm
        # Set the oim password
        oim.password=Welcome1
        WL.Username=weblogic
        WL.UserPassword=weblogic1
        # Set the oim URL here
        oim.url=t3://localhost:14000
        WL.url=t3://localhost:7001
        # Location from where the metadata files are picked up for MDS import
        metadata.location=C:/Project/src/ScheduledTask_Sample/metaxml/

    Following is the ant script to import metadata xmls:

        <target name="ImportMetadata">
            <echo>Preparing for MDS xmls Upload...</echo>
            <copy file="${oim.home}/bin/weblogic.properties" todir="."/>
            <replaceregexp file="weblogic.properties" match="wls_servername=(.*)" replace="wls_servername=${OIM.ServerName}" byline="true"/>
            <replaceregexp file="weblogic.properties" match="application_name=(.*)" replace="application_name=OIMMetadata" byline="true"/>
            <replaceregexp file="weblogic.properties" match="metadata_from_loc=(.*)" replace="metadata_from_loc=${metadata.location}" byline="true"/>
            <copy file="${oim.home}/bin/weblogicImportMetadata.py" todir="."/>
            <replace file="weblogicImportMetadata.py">
                <replacefilter token="connect()" value="connect('${wl.username}', '${wl.password}', '${wl.url}')"/>
            </replace>
            <echo>Importing metadata xmls to MDS...</echo>
            <exec dir="." vmlauncher="false" executable="${oim.home}/../common/bin/wlst.sh">
                <arg value="-loadProperties"/>
                <arg value="weblogic.properties"/>
                <arg value="weblogicImportMetadata.py"/>
                <redirector output="deletemd_redirector.out" logerror="true" outputproperty="deletemd_redirector.out"/>
            </exec>
            <echo>${deletemd_redirector.out}</echo>
            <echo>${deletemd_redirector.out}</echo>
            <echo>Completed metadata xmls import to MDS</echo>
        </target>

    Read the article

  • Informed TDD &ndash; Kata &ldquo;To Roman Numerals&rdquo;

    - by Ralf Westphal
    Originally posted on: http://geekswithblogs.net/theArchitectsNapkin/archive/2014/05/28/informed-tdd-ndash-kata-ldquoto-roman-numeralsrdquo.aspxIn a comment on my article on what I call Informed TDD (ITDD) reader gustav asked how this approach would apply to the kata “To Roman Numerals”. And whether ITDD wasn´t a violation of TDD´s principle of leaving out “advanced topics like mocks”. I like to respond with this article to his questions. There´s more to say than fits into a commentary. Mocks and TDD I don´t see in how far TDD is avoiding or opposed to mocks. TDD and mocks are orthogonal. TDD is about pocess, mocks are about structure and costs. Maybe by moving forward in tiny red+green+refactor steps less need arises for mocks. But then… if the functionality you need to implement requires “expensive” resource access you can´t avoid using mocks. Because you don´t want to constantly run all your tests against the real resource. True, in ITDD mocks seem to be in almost inflationary use. That´s not what you usually see in TDD demonstrations. However, there´s a reason for that as I tried to explain. I don´t use mocks as proxies for “expensive” resource. Rather they are stand-ins for functionality not yet implemented. They allow me to get a test green on a high level of abstraction. That way I can move forward in a top-down fashion. But if you think of mocks as “advanced” or if you don´t want to use a tool like JustMock, then you don´t need to use mocks. You just need to stand the sight of red tests for a little longer ;-) Let me show you what I mean by that by doing a kata. ITDD for “To Roman Numerals” gustav asked for the kata “To Roman Numerals”. I won´t explain the requirements again. You can find descriptions and TDD demonstrations all over the internet, like this one from Corey Haines. Now here is, how I would do this kata differently. 1. Analyse A demonstration of TDD should never skip the analysis phase. It should be made explicit. The requirements should be formalized and acceptance test cases should be compiled. “Formalization” in this case to me means describing the API of the required functionality. “[D]esign a program to work with Roman numerals” like written in this “requirement document” is not enough to start software development. Coding should only begin, if the interface between the “system under development” and its context is clear. If this interface is not readily recognizable from the requirements, it has to be developed first. Exploration of interface alternatives might be in order. It might be necessary to show several interface mock-ups to the customer – even if that´s you fellow developer. Designing the interface is a task of it´s own. It should not be mixed with implementing the required functionality behind the interface. Unfortunately, though, this happens quite often in TDD demonstrations. TDD is used to explore the API and implement it at the same time. To me that´s a violation of the Single Responsibility Principle (SRP) which not only should hold for software functional units but also for tasks or activities. In the case of this kata the API fortunately is obvious. Just one function is needed: string ToRoman(int arabic). And it lives in a class ArabicRomanConversions. Now what about acceptance test cases? There are hardly any stated in the kata descriptions. Roman numerals are explained, but no specific test cases from the point of view of a customer. So I just “invent” some acceptance test cases by picking roman numerals from a wikipedia article. 
They are supposed to be just “typical examples” without special meaning. Given the acceptance test cases I then try to develop an understanding of the problem domain. I´ll spare you that. The domain is trivial and is explain in almost all kata descriptions. How roman numerals are built is not difficult to understand. What´s more difficult, though, might be to find an efficient solution to convert into them automatically. 2. Solve The usual TDD demonstration skips a solution finding phase. Like the interface exploration it´s mixed in with the implementation. But I don´t think this is how it should be done. I even think this is not how it really works for the people demonstrating TDD. They´re simplifying their true software development process because they want to show a streamlined TDD process. I doubt this is helping anybody. Before you code you better have a plan what to code. This does not mean you have to do “Big Design Up-Front”. It just means: Have a clear picture of the logical solution in your head before you start to build a physical solution (code). Evidently such a solution can only be as good as your understanding of the problem. If that´s limited your solution will be limited, too. Fortunately, in the case of this kata your understanding does not need to be limited. Thus the logical solution does not need to be limited or preliminary or tentative. That does not mean you need to know every line of code in advance. It just means you know the rough structure of your implementation beforehand. Because it should mirror the process described by the logical or conceptual solution. Here´s my solution approach: The arabic “encoding” of numbers represents them as an ordered set of powers of 10. Each digit is a factor to multiply a power of ten with. The “encoding” 123 is the short form for a set like this: {1*10^2, 2*10^1, 3*10^0}. And the number is the sum of the set members. The roman “encoding” is different. There is no base (like 10 for arabic numbers), there are just digits of different value, and they have to be written in descending order. The “encoding” XVI is short for [10, 5, 1]. And the number is still the sum of the members of this list. The roman “encoding” thus is simpler than the arabic. Each “digit” can be taken at face value. No multiplication with a base required. But what about IV which looks like a contradiction to the above rule? It is not – if you accept roman “digits” not to be limited to be single characters only. Usually I, V, X, L, C, D, M are viewed as “digits”, and IV, IX etc. are viewed as nuisances preventing a simple solution. All looks different, though, once IV, IX etc. are taken as “digits”. Then MCMLIV is just a sum: M+CM+L+IV which is 1000+900+50+4. Whereas before it would have been understood as M-C+M+L-I+V – which is more difficult because here some “digits” get subtracted. Here´s the list of roman “digits” with their values: {1, I}, {4, IV}, {5, V}, {9, IX}, {10, X}, {40, XL}, {50, L}, {90, XC}, {100, C}, {400, CD}, {500, D}, {900, CM}, {1000, M} Since I take IV, IX etc. as “digits” translating an arabic number becomes trivial. I just need to find the values of the roman “digits” making up the number, e.g. 1954 is made up of 1000, 900, 50, and 4. I call those “digits” factors. If I move from the highest factor (M=1000) to the lowest (I=1) then translation is a two phase process: Find all the factors Translate the factors found Compile the roman representation Translation is just a look-up. 
    Finding, though, needs some calculation: find the highest remaining factor fitting into the value, remember it and subtract it from the value, then repeat with the remaining value and the remaining factors. Please note: this is just an algorithm. It's not code, even though it might be close. Being so close to code in my solution approach is due to the triviality of the problem. In more realistic examples the conceptual solution would be on a higher level of abstraction.

    With this solution in hand I finally can do what TDD advocates: find and prioritize test cases. As I can see from the small process description above, there are three aspects to test: the translation, the compilation, and finding the factors. Testing the translation primarily means checking whether the map of factors and digits is comprehensive. That's simple, even though it might be tedious. Testing the compilation is trivial. Testing factor finding, though, is a tad more complicated. I can think of several steps: First check if an arabic number equal to a factor is processed correctly (e.g. 1000=M). Then check if an arabic number consisting of two consecutive factors (e.g. 1900=[M,CM]) is processed correctly. Then check if a number consisting of the same factor twice is processed correctly (e.g. 2000=[M,M]). Finally check if an arabic number consisting of non-consecutive factors (e.g. 1400=[M,CD]) is processed correctly. I feel I can start an implementation now. If something becomes more complicated than expected I can slow down and repeat this process.

    3. Implement

    First I write a test for the acceptance test cases. It's red because there's no implementation, not even of the API. That's in conformance with "TDD lore", I'd say. Next I implement the API. The acceptance test now is formally correct, but still red of course. This will not change even now that I zoom in, because my goal is not to satisfy these tests as quickly as possible, but to implement my solution in a stepwise manner. That I do by "faking" it: I just "assume" three functions to represent the transformation process of my solution. My hypothesis is that those three functions in conjunction produce correct results on the API level. I just have to implement them correctly. That's what I'm trying now – one by one.

    I start with a simple "detail function": Translate(). And I start with all the test cases in the obvious equivalence partition. As you can see, I dare to test a private method. Yes. That's a white box test. But as you'll see, it won't make my tests brittle. It serves a purpose right here and now: it lets me focus on getting one aspect of my solution right. Here's the implementation to satisfy the test. It's as simple as possible – just how TDD wants me to do it: KISS.

    Now for the second equivalence partition: translating multiple factors. (It's a pattern: if you need to do something repeatedly, separate the tests for doing it once and doing it multiple times.) In this partition I just need a single test case, I guess. Stepping up from a single translation to multiple translations is no rocket science. Usually I would have implemented the final code right away. Splitting it in two steps is just for "educational purposes" here. How small your implementation steps are is a matter of your programming competency. Some "see" the final code right away before their mental eye – others need to work their way towards it. Having two tests I find more important.

    Now for the next low hanging fruit: compilation. It's even simpler than translation. A single test is enough, I guess.
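    The Translate() and compilation steps appear only as code screenshots in the original post, which did not survive this text dump. As a rough sketch of what such detail functions and their scaffolding tests could look like – the names Translate and Compile follow the article, everything else (signatures, LINQ usage, test values) is my assumption, and the methods are internal here only so the sketch compiles, whereas the article keeps them private and tests them as throwaway white-box tests:

        using System.Collections.Generic;
        using System.Linq;
        using NUnit.Framework;

        public static class RomanParts   // hypothetical container for the sketch
        {
            static readonly Dictionary<int, string> Digits = new Dictionary<int, string> {
                { 1, "I" }, { 4, "IV" }, { 5, "V" }, { 9, "IX" }, { 10, "X" },
                { 40, "XL" }, { 50, "L" }, { 90, "XC" }, { 100, "C" },
                { 400, "CD" }, { 500, "D" }, { 900, "CM" }, { 1000, "M" }
            };

            // Translating a single factor is just a look-up...
            internal static string Translate(int factor) { return Digits[factor]; }

            // ...and translating many factors repeats that look-up per factor.
            internal static IEnumerable<string> Translate(IEnumerable<int> factors)
            {
                return factors.Select(f => Translate(f));
            }

            // Compilation is plain string concatenation.
            internal static string Compile(IEnumerable<string> digits) { return string.Join("", digits); }
        }

        [TestFixture]
        public class ScaffoldingTests
        {
            [Test]
            public void Translate_single_factor() { Assert.AreEqual("IV", RomanParts.Translate(4)); }

            [Test]
            public void Translate_and_compile_multiple_factors()
            {
                // hypothetical test value: 1954 decomposes into 1000, 900, 50, 4
                Assert.AreEqual("MCMLIV",
                    RomanParts.Compile(RomanParts.Translate(new[] { 1000, 900, 50, 4 })));
            }
        }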
    Normally I would not even have bothered to write the compilation test, because the implementation is so simple. I don't need to test .NET framework functionality. But again: if it serves the educational purpose…

    Finally the most complicated part of the solution: finding the factors. There are several equivalence partitions. But still I decide to write just a single test, since the structure of the test data is the same for all partitions. Again, I'm faking the implementation first: I focus on just the first test case. No looping yet. Faking lets me stay on a high level of abstraction. I can write down the implementation of the solution without bothering myself with details of how to actually accomplish the feat. That's left for a drill-down with a test of the fake function. There are two main equivalence partitions, I guess: either the first factor is appropriate or some later one. The implementation seems easy. Both test cases are green. (Of course this only works on the premise that there's always a matching factor – which is the case, since the smallest factor is 1.) And the first of the equivalence partitions on the higher level also is satisfied. Great, I can move on.

    Now for more than a single factor: Interestingly, not just one test becomes green now, but all of them. Great! You might say: then I must not have done the simplest thing possible. And I would reply: I don't care. I did the most obvious thing. But I also find this loop very simple – even simpler than the recursion of which I had thought briefly during the problem-solving phase. And by the way: the acceptance tests also went green. Mission accomplished – at least functionality-wise.

    Now I have to tidy things up a bit. TDD calls for refactoring. Not much refactoring is needed, because I wrote the code in top-down fashion. I faked it until I made it. I endured red tests on higher levels while lower levels weren't perfected yet. But this way I saved myself from refactoring tediousness. At the end, though, some refactoring is required. But maybe in a different way than you would expect. That's why I rather call it "cleanup".

    First I remove duplication. There are two places where factors are defined: in Translate() and in Find_factors(). So I factor the map out into a class constant, which leads to a small conversion in Find_factors(). And now for the big cleanup: I remove all tests of private methods. They are scaffolding tests to me. They only have temporary value. They are brittle. Only acceptance tests need to remain. However, I carry over the single-"digit" tests from Translate() to the acceptance test. I find them valuable to keep, since the other acceptance tests only exercise a subset of all roman "digits". This then is my final test class, and this is the final production code. Test coverage as reported by NCrunch is 100%.

    Reflexion

    Is this the smallest possible code base for this kata? Surely not. You'll find more concise solutions on the internet. But LOC are of relatively little concern – as long as I can understand the code quickly. So-called "elegant" code, however, often is not easy to understand. The same goes for KISS code – especially if left unrefactored, as is often the case. That's why I progressed from requirements to final code the way I did. I first understood and solved the problem on a conceptual level. Then I implemented it top-down according to my design. I also could have implemented it bottom-up, since I knew some of the bottom of the solution: the leaves of the functional decomposition tree.
    Where things became fuzzy because the design did not cover the details anymore – as with Find_factors() – I repeated the process in the small, so to speak: fake some top level and endure red high-level tests while first solving a simpler problem. Using scaffolding tests (to be thrown away at the end) brought two advantages: Encapsulation of the implementation details was not compromised – private methods could naturally stay private, and I did not need to make them internal or public just to be able to test them. And I was able to write focused tests for small aspects of the solution, with no need to test everything through the solution root, the API.

    The bottom line thus for me is: Informed TDD produces cleaner code in a systematic way. It conforms to core principles of programming: the Single Responsibility Principle and/or Separation of Concerns. Distinct roles in development – being a researcher, being an engineer, being a craftsman – are represented as different phases. First find out what there is. Then devise a solution. Then code the solution – manifest the solution in code. Writing tests first is a good practice. But it should not be taken dogmatically. And above all it should not be overloaded with purposes. And finally: moving from top to bottom through a design produces refactored code right away. Clean code thus is almost inevitable – and not left to a refactoring step at the end, which is often skipped for various reasons.

    PS: Yes, I have done this kata several times. But that only has an impact on the time needed for phases 1 and 2. I won't skip them because of that. And there are no shortcuts during implementation because of that.
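    The final test class and production code are shown only as screenshots in the original post and are missing from this dump. Purely as an illustration of the approach described above – a greedy walk over the factor map from highest to lowest, followed by translation and compilation – a self-contained C# sketch might look like this (apart from the names ToRoman, ArabicRomanConversions, Find_factors, Translate and Compile, which the article mentions, all implementation details are my own assumptions, not the author's code):

        using System.Collections.Generic;
        using System.Linq;

        public class ArabicRomanConversions
        {
            // The factor/"digit" map factored out into a class constant, as described in the cleanup step.
            private static readonly KeyValuePair<int, string>[] Factors = {
                new KeyValuePair<int, string>(1000, "M"), new KeyValuePair<int, string>(900, "CM"),
                new KeyValuePair<int, string>(500, "D"),  new KeyValuePair<int, string>(400, "CD"),
                new KeyValuePair<int, string>(100, "C"),  new KeyValuePair<int, string>(90, "XC"),
                new KeyValuePair<int, string>(50, "L"),   new KeyValuePair<int, string>(40, "XL"),
                new KeyValuePair<int, string>(10, "X"),   new KeyValuePair<int, string>(9, "IX"),
                new KeyValuePair<int, string>(5, "V"),    new KeyValuePair<int, string>(4, "IV"),
                new KeyValuePair<int, string>(1, "I")
            };

            public static string ToRoman(int arabic)
            {
                return Compile(Translate(Find_factors(arabic)));
            }

            // Greedy factor finding: take the highest remaining factor fitting into the value,
            // subtract it, and repeat; this terminates because the smallest factor is 1.
            private static IEnumerable<int> Find_factors(int value)
            {
                var factors = new List<int>();
                foreach (var f in Factors)
                {
                    while (value >= f.Key)
                    {
                        factors.Add(f.Key);
                        value -= f.Key;
                    }
                }
                return factors;
            }

            // Translation is just a look-up of each factor's roman "digit".
            private static IEnumerable<string> Translate(IEnumerable<int> factors)
            {
                return factors.Select(f => Factors.First(x => x.Key == f).Value);
            }

            // Compilation concatenates the "digits" into the final representation.
            private static string Compile(IEnumerable<string> digits)
            {
                return string.Join("", digits);
            }
        }

    With this sketch, ArabicRomanConversions.ToRoman(1954) would collect the factors 1000, 900, 50 and 4 and return "MCMLIV" – the same decomposition used as the example in the solution phase above.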

    Read the article

  • Unicenter Software Delivery 4 not able to connect to MS SQL 2000 Database after W2003 SP2 upgrade

    - by grub
    Hello everyone,

    Yesterday I installed Windows Server 2003 Service Pack 2 on a Windows Server 2003 machine which has Unicenter Software Delivery 4 installed. Prior to the installation I disabled every CA service on the server (Brightstor, SDO, RCO, TNG) and the MS SQL 2000 service. After the installation of SP2 I enabled the services again, but the Unicenter service is not able to connect to the MS SQL 2000 database anymore. The database itself is up and running and I can connect to it with the Enterprise Manager. A dbcc checkdb doesn't return any errors on the Unicenter database. The Unicenter service throws the following error messages during startup:

    IM[1] 27/05 10:38:31,272 Installation Manager in init phase
    IM[1] 27/05 10:38:31,694 Process IM(L) - [004152] failed to open database SDDATA. dbopen() call failed.
    IM[1] 27/05 10:38:31,694 sqls error details:
    IM[1] 27/05 10:38:31,694 (null)
    IM[1] 27/05 10:38:32,069 ##EXCEPTION## TableError T@:PS_SQLS\isam_db.cxx:744.
    IM[1] 27/05 10:38:32,069 ##EXCEPTION## TableError C@:TaskmgrL\ASMTML.CXX:596.
    IM[1] 27/05 10:38:32,069 ##EXCEPTION## ErrorCode: 4711 in SDDATA:Isam::Isam. Process IM(L) - [004152] failed to open database SDDATA. dbopen() call failed.
    IM[1] 27/05 10:38:32,069 sqls error details:
    IM[1] 27/05 10:38:32,069 (null)
    IM[1] 27/05 10:38:32,069 returned 0.
    IM[1] 27/05 10:38:32,084 Persistent Storage could not be opened. Error cause is found in the ASM Event Log. Restart Task Manager.
    IM[1] 27/05 10:38:32,084 Failed to open database.
    IM[1] 27/05 10:38:32,084 Installation Manager ends

    If I check the Unicenter configuration with *chkmib_l* the tool throws an exception and creates a small dump file.

    An Exception Occurred:
    Time: 27/05 09:49:38,928
    Reason: ChkMIB_l.exe caused an UNKNOWN_EXCEPTION in module kernel32.dll at 7C82001B:77E4BEE7
    Registers: EAX=0012F908 EBX=00000000 ECX=00000000 EDX=02410004 ESI=0012F998 EDI=0012F998 EBP=0012F958 ESP=0012F904 EIP=77E4BEE7 FLG=00000206 CS =7C82001B DS =B90023 SS =120023 ES =120023 FS =7C82003B GS =3F0000
    Call Stack:
    7C82001B:77E4BEE7 (0xE06D7363 0x00000001 0x00000003 0x0012F98C) kernel32.dll
    7C82001B:77BB3259 (0x0012F9B8 0x2B017C50 0x2B024404 0x00B68C98) MSVCRT.dll
    7C82001B:2B010C42 (0x00020003 0x010C00FE 0x003F0190 0x00B69050) PS.dll
    << SOFTWARE DELIVERY INSTANCE INFO >>
    TRIGGER 0(1) instances:
    JCE 0(1) instances:
    TM 0(1) instances:
    IM 0(1) instances:
    DM 0(1) instances:
    DPU 0(71) instances:
    NATF 0(1) instances:
    MIBCONV 0(0) instances:
    API 0(4) instances:
    DTSFT 0(0) instances:
    TNGPOP 0(0) instances:
    DGATE 0(0) instances:
    << FLUSHING MEMORY TRACES >>
    << STOP FLUSHING MEMORY TRACES >>

    I compared the configuration of the SDO service and the system configuration with another server on which Windows Server 2003 SP2 is installed and SDO is working. The configuration is the same and the same driver and software versions are used. Do you have any idea what causes the connection issue? Should I uninstall the Unicenter service and do a fresh installation on the server, or should I remove Windows Server 2003 SP2? I don't want to remove SP2 because it's a requirement for WSUS 3 SP2, and I really don't want to know how many exploits are possible in such an old system ;-) Thank you very much and have a nice day. Below you can find more detailed information about the system and the SDO service. 
psinfo output (system information) System information for \\CZZAAS1003: Uptime: 0 days 14 hours 38 minutes 50 seconds Kernel version: Microsoft Windows Server 2003, Multiprocessor Free Product type: Standard Edition Product version: 5.2 Service pack: 2 Kernel build number: 3790 Install date: 23.9.2004, 11:16:11s IE version: 6.0000 System root: C:\WINDOWS Processors: 2 Processor speed: 2.3 GHz Processor type: Intel(R) Xeon(TM) CPU Physical memory: 1024 MB Video driver: RAGE XL PCI Family (Microsoft Corporation) sdver output (Unicenter Software delivery version) Unicenter Software Delivery 4.0 SP1 I2 ENU [2901] Copyright 2004 Computer Associates International, Incorporated ms sql 2000 version and odbc driver version MS SQL 2000 Server Standard Edition Product Version: 8.00.760 (SP3) ODBC Driver: SQL Server - Version 2000.86.3959.00 complete Unicenter Software delivery service log file TRIGGER[1] 27/05 10:38:28,366 SD Trigger Agent has started NATF[1] 27/05 10:38:28,928 Initiation phase finished IM[1] 27/05 10:38:31,272 Installation Manager in init phase IM[1] 27/05 10:38:31,694 Process IM(L) - [004152] failed to open database SDDATA. dbopen() call failed. IM[1] 27/05 10:38:31,694 sqls error details: IM[1] 27/05 10:38:31,694 (null) IM[1] 27/05 10:38:32,069 ##EXCEPTION## TableError T@:PS_SQLS\isam_db.cxx:744. IM[1] 27/05 10:38:32,069 ##EXCEPTION## TableError C@:TaskmgrL\ASMTML.CXX:596. IM[1] 27/05 10:38:32,069 ##EXCEPTION## ErrorCode: 4711 in SDDATA:Isam::Isam. Process IM(L) - [004152] failed to open database SDDATA. dbopen() call failed. IM[1] 27/05 10:38:32,069 sqls error details: IM[1] 27/05 10:38:32,069 (null) IM[1] 27/05 10:38:32,069 returned 0. IM[1] 27/05 10:38:32,084 Persistent Storage could not be opened. Error cause is found in the ASM Event Log. Restart Task Manager. IM[1] 27/05 10:38:32,084 Failed to open database. IM[1] 27/05 10:38:32,084 Installation Manager ends TM[1] 27/05 10:38:32,116 Task Manager in init phase TM[1] 27/05 10:38:32,334 Process TM(L) - [006132] failed to open database SDDATA. dbopen() call failed. TM[1] 27/05 10:38:32,334 sqls error details: TM[1] 27/05 10:38:32,334 (null) TM[1] 27/05 10:38:32,381 ##EXCEPTION## TableError T@:PS_SQLS\isam_db.cxx:744. TM[1] 27/05 10:38:32,381 ##EXCEPTION## TableError C@:TaskmgrL\ASMTML.CXX:596. TM[1] 27/05 10:38:32,381 ##EXCEPTION## ErrorCode: 4711 in SDDATA:Isam::Isam. Process TM(L) - [006132] failed to open database SDDATA. dbopen() call failed. TM[1] 27/05 10:38:32,381 sqls error details: TM[1] 27/05 10:38:32,381 (null) TM[1] 27/05 10:38:32,381 returned 0. TM[1] 27/05 10:38:32,381 Persistent Storage could not be opened. Error cause is found in the ASM Event Log. Restart Task Manager. TM[1] 27/05 10:38:32,381 Failed to open database. TM[1] 27/05 10:38:32,381 Task Manager ends DM[1] 27/05 10:38:33,272 Dialogue Manager is now active API[1] 27/05 10:38:34,397 API Server Process in init phase API[1] 27/05 10:38:34,397 API - SDNLS_Init API[1] 27/05 10:38:34,397 API - connectEM API[1] 27/05 10:38:34,412 API - apiServ.init DM[1] 27/05 10:38:34,678 **AND** 1 Agents triggered API[1] 27/05 10:38:34,709 Process API(L) - [005680] failed to open database SDDATA. dbopen() call failed. API[1] 27/05 10:38:34,709 sqls error details: API[1] 27/05 10:38:34,709 (null) API[1] 27/05 10:38:34,756 ##EXCEPTION## TableError T@:PS_SQLS\isam_db.cxx:744. API[1] 27/05 10:38:34,756 ##EXCEPTION## TableError C@:MainAPIL\APISERVL.CXX:246. API[1] 27/05 10:38:34,756 ##EXCEPTION## ErrorCode: 4711 in SDDATA:Isam::Isam. 
Process API(L) - [005680] failed to open database SDDATA. dbopen() call failed. API[1] 27/05 10:38:34,756 sqls error details: API[1] 27/05 10:38:34,756 (null) API[1] 27/05 10:38:34,756 returned 0. API[1] 27/05 10:38:34,756 Open of the database failed. API[1] 27/05 10:38:34,756 API - apiServ.init complete API[1] 27/05 10:38:34,756 API - start_APIServer DM[1] 27/05 10:38:34,803 CZZAAR1037 DPU[1:CZZAAR1037] 27/05 10:38:35,772 DPU in init phase DPU[1:CZZAAR1037] 27/05 10:38:36,100 >> GetManagerData DPU[1:CZZAAR1037] 27/05 10:38:36,287 >> SetCompInfo DPU[1:CZZAAR1037] 27/05 10:38:36,334 >> GetContainerList DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6ad DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6ad DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6b7 DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6b7 DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6c1 DPU[1:CZZAAR1037] 27/05 10:38:36,350 getJobState 3 from 5b6c1 DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b6cb DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b6cb DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b6f9 DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b6f9 DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b71a DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b71a DPU[1:CZZAAR1037] 27/05 10:38:36,366 getJobState 3 from 5b724 DPU[1:CZZAAR1037] 27/05 10:38:36,381 getJobState 3 from 5b724 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b72e DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b72e DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b738 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b738 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b742 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b742 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b74c DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b74c DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b756 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b756 DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b78a DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b78a DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b7af DPU[1:CZZAAR1037] 27/05 10:38:36,397 getJobState 3 from 5b7af DPU[1:CZZAAR1037] 27/05 10:38:36,522 >> SetCompAttr DPU[1:CZZAAR1037] 27/05 10:38:36,569 >> SetDetected DPU[1:CZZAAR1037] 27/05 10:38:36,584 disconnect DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b6ad DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b6b7 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b6c1 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b6cb DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b6f9 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b71a DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b724 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b72e DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b738 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b742 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b74c DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b756 DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b78a DPU[1:CZZAAR1037] 27/05 10:38:36,584 getJobState 3 from 5b7af DPU[1:CZZAAR1037] 27/05 10:38:36,584 DPU ends DM[1] 27/05 10:38:38,006 **AND** 0 Agents triggered JCE[1] 27/05 10:38:38,053 JCE starts DM[1] 27/05 10:38:38,287 CZZAAS1003 DPU[2:CZZAAS1003] 
27/05 10:38:38,412 DPU in init phase DPU[2:CZZAAS1003] 27/05 10:38:38,647 >> GetManagerData DPU[2:CZZAAS1003] 27/05 10:38:38,756 >> SetCompInfo DPU[2:CZZAAS1003] 27/05 10:38:38,787 >> GetContainerList DM[1] 27/05 10:38:38,850 **AND** 1 Agents triggered DM[1] 27/05 10:38:38,928 CZZAAR1124 DPU[3:CZZAAR1124] 27/05 10:38:39,053 DPU in init phase DPU[3:CZZAAR1124] 27/05 10:38:39,272 >> GetManagerData DM[1] 27/05 10:38:39,334 **AND** 1 Agents triggered DPU[3:CZZAAR1124] 27/05 10:38:39,381 >> SetCompInfo DPU[3:CZZAAR1124] 27/05 10:38:39,412 >> GetContainerList DM[1] 27/05 10:38:39,412 CZZAAR1125 DPU[3:CZZAAR1124] 27/05 10:38:39,428 getJobState 3 from 5b88e DPU[3:CZZAAR1124] 27/05 10:38:39,428 getJobState 3 from 5b88e DPU[2:CZZAAS1003] 27/05 10:38:39,491 >> SetCompAttr DPU[3:CZZAAR1124] 27/05 10:38:39,522 >> SetCompAttr DPU[4:CZZAAR1125] 27/05 10:38:39,522 DPU in init phase DPU[3:CZZAAR1124] 27/05 10:38:39,584 >> SetDetected DPU[2:CZZAAS1003] 27/05 10:38:39,584 >> SetDetected DPU[3:CZZAAR1124] 27/05 10:38:39,584 disconnect DPU[3:CZZAAR1124] 27/05 10:38:39,600 getJobState 3 from 5b88e DPU[3:CZZAAR1124] 27/05 10:38:39,600 DPU ends DPU[2:CZZAAS1003] 27/05 10:38:39,631 disconnect DPU[2:CZZAAS1003] 27/05 10:38:39,631 DPU ends DPU[4:CZZAAR1125] 27/05 10:38:39,756 >> GetManagerData DPU[4:CZZAAR1125] 27/05 10:38:39,850 >> SetCompInfo DPU[4:CZZAAR1125] 27/05 10:38:39,881 >> GetContainerList DPU[4:CZZAAR1125] 27/05 10:38:39,897 getJobState 3 from 5b8a9 DPU[4:CZZAAR1125] 27/05 10:38:39,897 getJobState 3 from 5b8a9 DPU[4:CZZAAR1125] 27/05 10:38:39,991 >> SetCompAttr DPU[4:CZZAAR1125] 27/05 10:38:40,100 >> SetDetected DPU[4:CZZAAR1125] 27/05 10:38:40,116 disconnect DPU[4:CZZAAR1125] 27/05 10:38:40,116 getJobState 3 from 5b8a9 DPU[4:CZZAAR1125] 27/05 10:38:40,116 DPU ends DM[1] 27/05 10:38:40,741 **AND** 0 Agents triggered JCE[1] 27/05 10:38:42,756 JCE ends DM[1] 27/05 10:38:47,475 **AND** 0 Agents triggered DM[1] 27/05 10:38:54,241 **AND** 0 Agents triggered

    Read the article

  • video and file caching with squid lusca?

    - by moon
    hello all i have configured squid lusca on ubuntu 11.04 version and also configured the video caching but the problem is the squid cannot configure the video more than 2 min long and the file of size upto 5.xx mbs only. here is my config please guide me how can i cache the long videos and files with squid: > # PORT and Transparent Option http_port 8080 transparent server_http11 on icp_port 0 > > # Cache Directory , modify it according to your system. > # but first create directory in root by mkdir /cache1 > # and then issue this command chown proxy:proxy /cache1 > # [for ubuntu user is proxy, in Fedora user is SQUID] > # I have set 500 MB for caching reserved just for caching , > # adjust it according to your need. > # My recommendation is to have one cache_dir per drive. zzz > > #store_dir_select_algorithm round-robin cache_dir aufs /cache1 500 16 256 cache_replacement_policy heap LFUDA memory_replacement_policy heap > LFUDA > > # If you want to enable DATE time n SQUID Logs,use following emulate_httpd_log on logformat squid %tl %6tr %>a %Ss/%03Hs %<st %rm > %ru %un %Sh/%<A %mt log_fqdn off > > # How much days to keep users access web logs > # You need to rotate your log files with a cron job. For example: > # 0 0 * * * /usr/local/squid/bin/squid -k rotate logfile_rotate 14 debug_options ALL,1 cache_access_log /var/log/squid/access.log > cache_log /var/log/squid/cache.log cache_store_log > /var/log/squid/store.log > > #I used DNSAMSQ service for fast dns resolving > #so install by using "apt-get install dnsmasq" first dns_nameservers 127.0.0.1 101.11.11.5 ftp_user anonymous@ ftp_list_width 32 ftp_passive on ftp_sanitycheck on > > #ACL Section acl all src 0.0.0.0/0.0.0.0 acl manager proto cache_object acl localhost src 127.0.0.1/255.255.255.255 acl > to_localhost dst 127.0.0.0/8 acl SSL_ports port 443 563 # https, snews > acl SSL_ports port 873 # rsync acl Safe_ports port 80 # http acl > Safe_ports port 21 # ftp acl Safe_ports port 443 563 # https, snews > acl Safe_ports port 70 # gopher acl Safe_ports port 210 # wais acl > Safe_ports port 1025-65535 # unregistered ports acl Safe_ports port > 280 # http-mgmt acl Safe_ports port 488 # gss-http acl Safe_ports port > 591 # filemaker acl Safe_ports port 777 # multiling http acl > Safe_ports port 631 # cups acl Safe_ports port 873 # rsync acl > Safe_ports port 901 # SWAT acl purge method PURGE acl CONNECT method > CONNECT http_access allow manager localhost http_access deny manager > http_access allow purge localhost http_access deny purge http_access > deny !Safe_ports http_access deny CONNECT !SSL_ports http_access allow > localhost http_access allow all http_reply_access allow all icp_access > allow all > > #========================== > # Administrative Parameters > #========================== > > # I used UBUNTU so user is proxy, in FEDORA you may use use squid cache_effective_user proxy cache_effective_group proxy cache_mgr > [email protected] visible_hostname proxy.aacable.net unique_hostname > [email protected] > > #============= > # ACCELERATOR > #============= half_closed_clients off quick_abort_min 0 KB quick_abort_max 0 KB vary_ignore_expire on reload_into_ims on log_fqdn > off memory_pools off > > # If you want to hide your proxy machine from being detected at various site use following via off > > #============================================ > # OPTIONS WHICH AFFECT THE CACHE SIZE / zaib > #============================================ > # If you have 4GB memory in Squid box, we will use formula of 1/3 > # You can adjust it 
according to your need. IF squid is taking too much of RAM > # Then decrease it to 128 MB or even less. > > cache_mem 256 MB minimum_object_size 512 bytes maximum_object_size 500 > MB maximum_object_size_in_memory 128 KB > > #============================================================$ > # SNMP , if you want to generate graphs for SQUID via MRTG > #============================================================$ > #acl snmppublic snmp_community gl > #snmp_port 3401 > #snmp_access allow snmppublic all > #snmp_access allow all > > #============================================================ > # ZPH , To enable cache content to be delivered at full lan speed, > # To bypass the queue at MT. > #============================================================ tcp_outgoing_tos 0x30 all zph_mode tos zph_local 0x30 zph_parent 0 > zph_option 136 > > # Caching Youtube acl videocache_allow_url url_regex -i \.youtube\.com\/get_video\? acl videocache_allow_url url_regex -i > \.youtube\.com\/videoplayback \.youtube\.com\/videoplay > \.youtube\.com\/get_video\? acl videocache_allow_url url_regex -i > \.youtube\.[a-z][a-z]\/videoplayback \.youtube\.[a-z][a-z]\/videoplay > \.youtube\.[a-z][a-z]\/get_video\? acl videocache_allow_url url_regex > -i \.googlevideo\.com\/videoplayback \.googlevideo\.com\/videoplay \.googlevideo\.com\/get_video\? acl videocache_allow_url url_regex -i > \.google\.com\/videoplayback \.google\.com\/videoplay > \.google\.com\/get_video\? acl videocache_allow_url url_regex -i > \.google\.[a-z][a-z]\/videoplayback \.google\.[a-z][a-z]\/videoplay > \.google\.[a-z][a-z]\/get_video\? acl videocache_allow_url url_regex > -i proxy[a-z0-9\-][a-z0-9][a-z0-9][a-z0-9]?\.dailymotion\.com\/ acl videocache_allow_url url_regex -i vid\.akm\.dailymotion\.com\/ acl > videocache_allow_url url_regex -i > [a-z0-9][0-9a-z][0-9a-z]?[0-9a-z]?[0-9a-z]?\.xtube\.com\/(.*)flv acl > videocache_allow_url url_regex -i \.vimeo\.com\/(.*)\.(flv|mp4) acl > videocache_allow_url url_regex -i > va\.wrzuta\.pl\/wa[0-9][0-9][0-9][0-9]? 
acl videocache_allow_url > url_regex -i \.youporn\.com\/(.*)\.flv acl videocache_allow_url > url_regex -i \.msn\.com\.edgesuite\.net\/(.*)\.flv acl > videocache_allow_url url_regex -i \.tube8\.com\/(.*)\.(flv|3gp) acl > videocache_allow_url url_regex -i \.mais\.uol\.com\.br\/(.*)\.flv acl > videocache_allow_url url_regex -i > \.blip\.tv\/(.*)\.(flv|avi|mov|mp3|m4v|mp4|wmv|rm|ram|m4v) acl > videocache_allow_url url_regex -i > \.apniisp\.com\/(.*)\.(flv|avi|mov|mp3|m4v|mp4|wmv|rm|ram|m4v) acl > videocache_allow_url url_regex -i \.break\.com\/(.*)\.(flv|mp4) acl > videocache_allow_url url_regex -i redtube\.com\/(.*)\.flv acl > videocache_allow_dom dstdomain .mccont.com .metacafe.com > .cdn.dailymotion.com acl videocache_deny_dom dstdomain > .download.youporn.com .static.blip.tv acl dontrewrite url_regex > redbot\.org \.php acl getmethod method GET > > storeurl_access deny dontrewrite storeurl_access deny !getmethod > storeurl_access deny videocache_deny_dom storeurl_access allow > videocache_allow_url storeurl_access allow videocache_allow_dom > storeurl_access deny all > > storeurl_rewrite_program /etc/squid/storeurl.pl > storeurl_rewrite_children 7 storeurl_rewrite_concurrency 10 > > acl store_rewrite_list urlpath_regex -i > \/(get_video\?|videodownload\?|videoplayback.*id) acl > store_rewrite_list urlpath_regex -i \.flv$ \.mp3$ \.mp4$ \.swf$ \ > storeurl_access allow store_rewrite_list storeurl_access deny all > > refresh_pattern -i \.flv$ 10080 80% 10080 override-expire > override-lastmod reload-into-ims ignore-reload ignore-no-cache > ignore-private ignore-auth refresh_pattern -i \.mp3$ 10080 80% 10080 > override-expire override-lastmod reload-into-ims ignore-reload > ignore-no-cache ignore-private ignore-auth refresh_pattern -i \.mp4$ > 10080 80% 10080 override-expire override-lastmod reload-into-ims > ignore-reload ignore-no-cache ignore-private ignore-auth > refresh_pattern -i \.swf$ 10080 80% 10080 override-expire > override-lastmod reload-into-ims ignore-reload ignore-no-cache > ignore-private ignore-auth refresh_pattern -i \.gif$ 10080 80% 10080 > override-expire override-lastmod reload-into-ims ignore-reload > ignore-no-cache ignore-private ignore-auth refresh_pattern -i \.jpg$ > 10080 80% 10080 override-expire override-lastmod reload-into-ims > ignore-reload ignore-no-cache ignore-private ignore-auth > refresh_pattern -i \.jpeg$ 10080 80% 10080 override-expire > override-lastmod reload-into-ims ignore-reload ignore-no-cache > ignore-private ignore-auth refresh_pattern -i \.exe$ 10080 80% 10080 > override-expire override-lastmod reload-into-ims ignore-reload > ignore-no-cache ignore-private ignore-auth > > # 1 year = 525600 mins, 1 month = 10080 mins, 1 day = 1440 refresh_pattern (get_video\?|videoplayback\?|videodownload\?|\.flv?) > 10080 80% 10080 ignore-no-cache ignore-private override-expire > override-lastmod reload-into-ims refresh_pattern > (get_video\?|videoplayback\?id|videoplayback.*id|videodownload\?|\.flv?) > 10080 80% 10080 ignore-no-cache ignore-private override-expire > override-lastmod reload-into-ims refresh_pattern \.(ico|video-stats) > 10080 80% 10080 override-expire ignore-reload ignore-no-cache > ignore-private ignore-auth override-lastmod negative-ttl=10080 > refresh_pattern \.etology\? 10080 > 80% 10080 override-expire ignore-reload ignore-no-cache > refresh_pattern galleries\.video(\?|sz) 10080 > 80% 10080 override-expire ignore-reload ignore-no-cache > refresh_pattern brazzers\? 
10080 > 80% 10080 override-expire ignore-reload ignore-no-cache > refresh_pattern \.adtology\? 10080 > 80% 10080 override-expire ignore-reload ignore-no-cache > refresh_pattern > ^.*(utm\.gif|ads\?|rmxads\.com|ad\.z5x\.net|bh\.contextweb\.com|bstats\.adbrite\.com|a1\.interclick\.com|ad\.trafficmp\.com|ads\.cubics\.com|ad\.xtendmedia\.com|\.googlesyndication\.com|advertising\.com|yieldmanager|game-advertising\.com|pixel\.quantserve\.com|adperium\.com|doubleclick\.net|adserving\.cpxinteractive\.com|syndication\.com|media.fastclick.net).* > 10080 20% 10080 ignore-no-cache ignore-private override-expire > ignore-reload ignore-auth negative-ttl=40320 max-stale=10 > refresh_pattern ^.*safebrowsing.*google 10080 80% 10080 > override-expire ignore-reload ignore-no-cache ignore-private > ignore-auth negative-ttl=10080 refresh_pattern > ^http://((cbk|mt|khm|mlt)[0-9]?)\.google\.co(m|\.uk) 10080 80% > 10080 override-expire ignore-reload ignore-private negative-ttl=10080 > refresh_pattern ytimg\.com.*\.jpg > 10080 80% 10080 override-expire ignore-reload refresh_pattern > images\.friendster\.com.*\.(png|gif) 10080 80% > 10080 override-expire ignore-reload refresh_pattern garena\.com > 10080 80% 10080 override-expire reload-into-ims refresh_pattern > photobucket.*\.(jp(e?g|e|2)|tiff?|bmp|gif|png) 10080 80% > 10080 override-expire ignore-reload refresh_pattern > vid\.akm\.dailymotion\.com.*\.on2\? 10080 80% > 10080 ignore-no-cache override-expire override-lastmod refresh_pattern > mediafire.com\/images.*\.(jp(e?g|e|2)|tiff?|bmp|gif|png) 10080 80% > 10080 reload-into-ims override-expire ignore-private refresh_pattern > ^http:\/\/images|pics|thumbs[0-9]\. 10080 80% > 10080 reload-into-ims ignore-no-cache ignore-reload override-expire > refresh_pattern ^http:\/\/www.onemanga.com.*\/ > 10080 80% 10080 reload-into-ims ignore-no-cache ignore-reload > override-expire refresh_pattern > ^http://v\.okezone\.com/get_video\/([a-zA-Z0-9]) 10080 80% 10080 > override-expire ignore-reload ignore-no-cache ignore-private > ignore-auth override-lastmod negative-ttl=10080 > > #images facebook refresh_pattern -i \.facebook.com.*\.(jpg|png|gif) 10080 80% 10080 ignore-reload override-expire ignore-no-cache > refresh_pattern -i \.fbcdn.net.*\.(jpg|gif|png|swf|mp3) > 10080 80% 10080 ignore-reload override-expire ignore-no-cache > refresh_pattern static\.ak\.fbcdn\.net*\.(jpg|gif|png) > 10080 80% 10080 ignore-reload override-expire ignore-no-cache > refresh_pattern ^http:\/\/profile\.ak\.fbcdn.net*\.(jpg|gif|png) > 10080 80% 10080 ignore-reload override-expire ignore-no-cache > > #All File refresh_pattern -i \.(3gp|7z|ace|asx|bin|deb|divx|dvr-ms|ram|rpm|exe|inc|cab|qt) > 10080 80% 10080 ignore-no-cache override-expire override-lastmod > reload-into-ims refresh_pattern -i > \.(rar|jar|gz|tgz|bz2|iso|m1v|m2(v|p)|mo(d|v)|arj|lha|lzh|zip|tar) > 10080 80% 10080 ignore-no-cache override-expire override-lastmod > reload-into-ims refresh_pattern -i > \.(jp(e?g|e|2)|gif|pn[pg]|bm?|tiff?|ico|swf|dat|ad|txt|dll) > 10080 80% 10080 ignore-no-cache override-expire override-lastmod > reload-into-ims refresh_pattern -i > \.(avi|ac4|mp(e?g|a|e|1|2|3|4)|mk(a|v)|ms(i|u|p)|og(x|v|a|g)|rm|r(a|p)m|snd|vob) > 10080 80% 10080 ignore-no-cache override-expire override-lastmod > reload-into-ims refresh_pattern -i > \.(pp(t?x)|s|t)|pdf|rtf|wax|wm(a|v)|wmx|wpl|cb(r|z|t)|xl(s?x)|do(c?x)|flv|x-flv) > 10080 80% 10080 ignore-no-cache override-expire override-lastmod > reload-into-ims > > refresh_pattern -i (/cgi-bin/|\?) 
0 0% 0 refresh_pattern ^gopher: > 1440 0% 1440 refresh_pattern ^ftp: 10080 95% 10080 > override-lastmod reload-into-ims refresh_pattern . 1440 > 95% 10080 override-lastmod reload-into-ims

    Read the article

  • Debugging PHP Mail() and/or PHPMailer

    - by Agos
    Hi, I'm quite stuck with a problem sending mail from a PHP script. Some data: Shared hosting, no SSH access, only hosting provider panel PHP version 5.2.5 Last year I built a site which had no problems sending mail with the same hosting Let's say the domain is “domain.com” and my private address is “[email protected]” for anonimity's sake in the following code. Here's the code: <?php error_reporting(E_ALL); ini_set("display_errors", 1); $to = "[email protected]"; $subject = "Hi"; $body = "Test 1\nTest 2\nTest 3"; $headers = 'From: [email protected]' . "\r\n" . 'errors-to: [email protected]' . "\r\n" . 'X-Mailer: PHP/' . phpversion(); if (mail($to, $subject, $body, $headers)) { echo("Message successfully sent"); } else { echo("Message sending failed"); } require('class.phpmailer.php'); $message = "Hello world"; $mail = new PHPMailer(); $mail->CharSet = "UTF-8"; $mail->AddAddress("[email protected]", "Agos"); $mail->SetFrom("[email protected]","My Site"); $mail->Subject = "Test Message"; $mail->Body = $message; $mail->Send(); ?> And here is what I get: Message sending failed 'ai' = 'application/postscript', 'eps' = 'application/postscript', 'ps' = 'application/postscript', 'smi' = 'application/smil', 'smil' = 'application/smil', 'mif' = 'application/vnd.mif', 'xls' = 'application/vnd.ms-excel', 'ppt' = 'application/vnd.ms-powerpoint', 'wbxml' = 'application/vnd.wap.wbxml', 'wmlc' = 'application/vnd.wap.wmlc', 'dcr' = 'application/x-director', 'dir' = 'application/x-director', 'dxr' = 'application/x-director', 'dvi' = 'application/x-dvi', 'gtar' = 'application/x-gtar', 'php' = 'application/x-httpd-php', 'php4' = 'application/x-httpd-php', 'php3' = 'application/x-httpd-php', 'phtml' = 'application/x-httpd-php', 'phps' = 'application/x-httpd-php-source', 'js' = 'application/x-javascript', 'swf' = 'application/x-shockwave-flash', 'sit' = 'application/x-stuffit', 'tar' = 'application/x-tar', 'tgz' = 'application/x-tar', 'xhtml' = 'application/xhtml+xml', 'xht' = 'application/xhtml+xml', 'zip' = 'application/zip', 'mid' = 'audio/midi', 'midi' = 'audio/midi', 'mpga' = 'audio/mpeg', 'mp2' = 'audio/mpeg', 'mp3' = 'audio/mpeg', 'aif' = 'audio/x-aiff', 'aiff' = 'audio/x-aiff', 'aifc' = 'audio/x-aiff', 'ram' = 'audio/x-pn-realaudio', 'rm' = 'audio/x-pn-realaudio', 'rpm' = 'audio/x-pn-realaudio-plugin', 'ra' = 'audio/x-realaudio', 'rv' = 'video/vnd.rn-realvideo', 'wav' = 'audio/x-wav', 'bmp' = 'image/bmp', 'gif' = 'image/gif', 'jpeg' = 'image/jpeg', 'jpg' = 'image/jpeg', 'jpe' = 'image/jpeg', 'png' = 'image/png', 'tiff' = 'image/tiff', 'tif' = 'image/tiff', 'css' = 'text/css', 'html' = 'text/html', 'htm' = 'text/html', 'shtml' = 'text/html', 'txt' = 'text/plain', 'text' = 'text/plain', 'log' = 'text/plain', 'rtx' = 'text/richtext', 'rtf' = 'text/rtf', 'xml' = 'text/xml', 'xsl' = 'text/xml', 'mpeg' = 'video/mpeg', 'mpg' = 'video/mpeg', 'mpe' = 'video/mpeg', 'qt' = 'video/quicktime', 'mov' = 'video/quicktime', 'avi' = 'video/x-msvideo', 'movie' = 'video/x-sgi-movie', 'doc' = 'application/msword', 'word' = 'application/msword', 'xl' = 'application/excel', 'eml' = 'message/rfc822' ); return (!isset($mimes[strtolower($ext)])) ? 'application/octet-stream' : $mimes[strtolower($ext)]; } /** * Set (or reset) Class Objects (variables) * * Usage Example: * $page-set('X-Priority', '3'); * * @access public * @param string $name Parameter Name * @param mixed $value Parameter Value * NOTE: will not work with arrays, there are no arrays to set/reset * @todo Should this not be using __set() magic function? 
*/ public function set($name, $value = '') { try { if (isset($this-$name) ) { $this-$name = $value; } else { throw new phpmailerException($this-Lang('variable_set') . $name, self::STOP_CRITICAL); } } catch (Exception $e) { $this-SetError($e-getMessage()); if ($e-getCode() == self::STOP_CRITICAL) { return false; } } return true; } /** * Strips newlines to prevent header injection. * @access public * @param string $str String * @return string */ public function SecureHeader($str) { $str = str_replace("\r", '', $str); $str = str_replace("\n", '', $str); return trim($str); } /** * Set the private key file and password to sign the message. * * @access public * @param string $key_filename Parameter File Name * @param string $key_pass Password for private key */ public function Sign($cert_filename, $key_filename, $key_pass) { $this-sign_cert_file = $cert_filename; $this-sign_key_file = $key_filename; $this-sign_key_pass = $key_pass; } /** * Set the private key file and password to sign the message. * * @access public * @param string $key_filename Parameter File Name * @param string $key_pass Password for private key */ public function DKIM_QP($txt) { $tmp=""; $line=""; for ($i=0;$i<= $ord) && ($ord <= 0x3A)) || $ord == 0x3C || ((0x3E <= $ord) && ($ord <= 0x7E)) ) { $line.=$txt[$i]; } else { $line.="=".sprintf("%02X",$ord); } } return $line; } /** * Generate DKIM signature * * @access public * @param string $s Header */ public function DKIM_Sign($s) { $privKeyStr = file_get_contents($this-DKIM_private); if ($this-DKIM_passphrase!='') { $privKey = openssl_pkey_get_private($privKeyStr,$this-DKIM_passphrase); } else { $privKey = $privKeyStr; } if (openssl_sign($s, $signature, $privKey)) { return base64_encode($signature); } } /** * Generate DKIM Canonicalization Header * * @access public * @param string $s Header */ public function DKIM_HeaderC($s) { $s=preg_replace("/\r\n\s+/"," ",$s); $lines=explode("\r\n",$s); foreach ($lines as $key=$line) { list($heading,$value)=explode(":",$line,2); $heading=strtolower($heading); $value=preg_replace("/\s+/"," ",$value) ; // Compress useless spaces $lines[$key]=$heading.":".trim($value) ; // Don't forget to remove WSP around the value } $s=implode("\r\n",$lines); return $s; } /** * Generate DKIM Canonicalization Body * * @access public * @param string $body Message Body */ public function DKIM_BodyC($body) { if ($body == '') return "\r\n"; // stabilize line endings $body=str_replace("\r\n","\n",$body); $body=str_replace("\n","\r\n",$body); // END stabilize line endings while (substr($body,strlen($body)-4,4) == "\r\n\r\n") { $body=substr($body,0,strlen($body)-2); } return $body; } /** * Create the DKIM header, body, as new header * * @access public * @param string $headers_line Header lines * @param string $subject Subject * @param string $body Body */ public function DKIM_Add($headers_line,$subject,$body) { $DKIMsignatureType = 'rsa-sha1'; // Signature & hash algorithms $DKIMcanonicalization = 'relaxed/simple'; // Canonicalization of header/body $DKIMquery = 'dns/txt'; // Query method $DKIMtime = time() ; // Signature Timestamp = seconds since 00:00:00 - Jan 1, 1970 (UTC time zone) $subject_header = "Subject: $subject"; $headers = explode("\r\n",$headers_line); foreach($headers as $header) { if (strpos($header,'From:') === 0) { $from_header=$header; } elseif (strpos($header,'To:') === 0) { $to_header=$header; } } $from = str_replace('|','=7C',$this-DKIM_QP($from_header)); $to = str_replace('|','=7C',$this-DKIM_QP($to_header)); $subject = 
str_replace('|','=7C',$this-DKIM_QP($subject_header)) ; // Copied header fields (dkim-quoted-printable $body = $this-DKIM_BodyC($body); $DKIMlen = strlen($body) ; // Length of body $DKIMb64 = base64_encode(pack("H*", sha1($body))) ; // Base64 of packed binary SHA-1 hash of body $ident = ($this-DKIM_identity == '')? '' : " i=" . $this-DKIM_identity . ";"; $dkimhdrs = "DKIM-Signature: v=1; a=" . $DKIMsignatureType . "; q=" . $DKIMquery . "; l=" . $DKIMlen . "; s=" . $this-DKIM_selector . ";\r\n". "\tt=" . $DKIMtime . "; c=" . $DKIMcanonicalization . ";\r\n". "\th=From:To:Subject;\r\n". "\td=" . $this-DKIM_domain . ";" . $ident . "\r\n". "\tz=$from\r\n". "\t|$to\r\n". "\t|$subject;\r\n". "\tbh=" . $DKIMb64 . ";\r\n". "\tb="; $toSign = $this-DKIM_HeaderC($from_header . "\r\n" . $to_header . "\r\n" . $subject_header . "\r\n" . $dkimhdrs); $signed = $this-DKIM_Sign($toSign); return "X-PHPMAILER-DKIM: phpmailer.worxware.com\r\n".$dkimhdrs.$signed."\r\n"; } protected function doCallback($isSent,$to,$cc,$bcc,$subject,$body) { if (!empty($this-action_function) && function_exists($this-action_function)) { $params = array($isSent,$to,$cc,$bcc,$subject,$body); call_user_func_array($this-action_function,$params); } } } class phpmailerException extends Exception { public function errorMessage() { $errorMsg = '' . $this-getMessage() . " \n"; return $errorMsg; } } ? Fatal error: Class 'PHPMailer' not found in /mailtest.php on line 20 Which is baffling to say the least. Is there anything I can do to get at least some more meaningful errors? Why is code from the class showing up in my file?

    Read the article

< Previous Page | 1 2 3 4