Friday, May 6, 2011

Break the PDF document after 100 pages

I am working with JasperReports and the iReport tool. One of the client's requirements is that the generated PDF document be limited to 100 pages.

Could you please help me? How can I generate the 100-page PDF document?

From stackoverflow
  • In iReport there is a built-in variable, PAGE_COUNT. For every element in the detail band you can put the following in the "Print when expression" textbox:

    Boolean.valueOf($V{PAGE_COUNT}.intValue() < 100)
    

    This will stop printing after page number 100.
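For context, the same expression ends up in the JRXML source as a printWhenExpression on each detail element. A minimal illustrative fragment (the field name and element geometry are made up):

```xml
<textField>
    <reportElement x="0" y="0" width="100" height="20">
        <printWhenExpression><![CDATA[Boolean.valueOf($V{PAGE_COUNT}.intValue() < 100)]]></printWhenExpression>
    </reportElement>
    <textFieldExpression><![CDATA[$F{myField}]]></textFieldExpression>
</textField>
```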

IE WebControls TreeView - web application javascript problem

I am using the old Microsoft.Web.UI.WebControls TreeView control. When running under a web application project, the treeview.GetChildren() method throws an 'object or property not supported' error, yet the same code in an ASP.NET 2.0 website project runs fine.

Has anyone encountered this issue? There is almost nothing on the web about this control...

thanks for any help.

From stackoverflow
  • Because the web controls completely suck dude. Use javascript! Not that horrible non-standard-compliant stuff...

  • santiycr, you really should take some time to learn about web technologies before writing stupid comments.

Reflection performance for Data Access Layer

Hi all,
In the past I created a framework for a project; one of its functions was to load database info into my business entity classes (properties only, no methods) and, in the other direction, to load the parameter collection of the stored procedure to be executed from those entity classes. To do this, I decorated the business entity classes with the DB field info and SP parameters, as in the sample below, and let the framework populate the entity or the parameter collection using reflection, so I didn't have to generate new code during maintenance.
Now I am creating a new and much bigger project, with much more code to maintain, where performance is critical. Is it worth using reflection for all the loading, keeping the code much simpler, or should I actually generate all the code and maintain all the changes by hand?
I did some searching and read some of the documentation on MSDN, but still found lots of different opinions: people who like reflection show numbers suggesting the overhead is not that bad, while others say it's better to stay away from reflection altogether.

Technical specs for the new app:
Language: C#
.Net Version: 3.5
Application Type: Classic Web Forms accessing Logic components and Data Access Tier also in C#
Database: SQL Server 2008
Database Abstraction Layer: All access to the DB is made via Stored Procedures and User Defined Functions.


Sample Code:

    // Decorated class
    [System.Serializable()]
    public class bMyBusinessEntity{
        private Int64 _MyEntityID;
        private string _MyEntityName;
        private string _MyEntityDescription;

        [aFieldDataSource(DataColumn = "MyEntityID")]
        [aRequiredField(ErrorMessage = "The field My Entity ID is mandatory!")]
        [aFieldSPParameter(ParameterName = "MyEntityID")]
        public Int64 MyEntityID{
            get { return _MyEntityID; }
            set { _MyEntityID = value; }
        }

        [aFieldDataSource(DataColumn = "MyEntityName")]
        [aFieldSPParameter(ParameterName = "MyEntityName")]
        public string MyEntityName{
            get { return _MyEntityName; }
            set { _MyEntityName = value; }
        }

        [aFieldDataSource(DataColumn = "MyEntityDescription")]
        [aFieldSPParameter(ParameterName = "MyEntityDescription")]
        public string MyEntityDescription{
            get { return _MyEntityDescription; }
            set { _MyEntityDescription = value; }
        }
    }


    // To load from the DB into the object:
    using (DataTable dtblMyEntities = objDataSource.ExecuteProcedure(strSPName, objParams)) {
        if (dtblMyEntities.Rows.Count > 0) {
            DataRow drw = dtblMyEntities.Rows[0];
            oFieldDataSource.LoadInfo(ref objMyEntity, drw);
            return objMyEntity;
        }
        else
            throw new Exception("Row not found!");
    }

    // To load from the object into the DB:
    oDataSource objDataSource = new oDataSource();
    IDbDataParameter[] objParams = objDataSource.GetProcedureParameters(strSPName);
    oFieldSPParameter.LoadInfo(objParams, objMyEntity);
    objDataSource.ExecuteNonQuery(strSPName, objParams);
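The attribute-driven mapping above is language-neutral; the same idea can be sketched with annotations and reflection in Java, which I use here only for illustration. All names are hypothetical stand-ins for the C# ones in the sample:

```java
import java.lang.annotation.*;
import java.lang.reflect.Field;
import java.util.Map;

public class ReflectiveMapper {
    // Hypothetical counterpart of the aFieldDataSource attribute above.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.FIELD)
    public @interface FieldDataSource { String dataColumn(); }

    public static class MyBusinessEntity {
        @FieldDataSource(dataColumn = "MyEntityID")
        public long myEntityId;
        @FieldDataSource(dataColumn = "MyEntityName")
        public String myEntityName;
    }

    // Walks the annotated fields and copies matching row values into the
    // entity -- the same job oFieldDataSource.LoadInfo does in the sample.
    public static void loadInfo(Object entity, Map<String, Object> row) {
        for (Field f : entity.getClass().getDeclaredFields()) {
            FieldDataSource a = f.getAnnotation(FieldDataSource.class);
            if (a != null && row.containsKey(a.dataColumn())) {
                try {
                    f.setAccessible(true);
                    f.set(entity, row.get(a.dataColumn())); // unboxes Long -> long
                } catch (IllegalAccessException ex) {
                    throw new RuntimeException(ex);
                }
            }
        }
    }

    public static void main(String[] args) {
        MyBusinessEntity e = new MyBusinessEntity();
        loadInfo(e, Map.of("MyEntityID", 42L, "MyEntityName", "demo"));
        System.out.println(e.myEntityId + " " + e.myEntityName);
    }
}
```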
From stackoverflow
  • Rather than rolling what is basically your own ORM, I would recommend switching to one of the established ORMs such as NHibernate or Entity Framework.

    To answer your question directly, reflection performance isn't that bad, but I'd personally never think of using an ORM I rolled myself on a large project.

  • Personally, I wouldn't use Reflection if the data access requirements call for a large number of transactions (a highly transactional system) - what you gain in flexibility ultimately costs you at runtime (more comment on reflection here).

    I'd pick a popular ORM solution in preference to a custom solution. Mainly you will benefit from a larger community of people using the same approaches (easier to get design advice, debug, and take advantage of known performance tweaks).

    It also usually means access to updates which support newer technology (e.g. SQL Server 2008) as it is released - you don't carry that burden, or the cost of testing (other than straight implementation).

    There are a number of popular solutions including the Entity Framework and LINQ to SQL (in .Net 3.5 and both support Stored Procs) but also a great deal of support for a template-driven approach using CodeSmith templates/Net Tiers, or more complicated solutions using NHibernate or Deklarit, for example.

    The last big solution I was involved with used Stored Procedures and Functions in much the same way you have described, however we used the Enterprise Library and generated DAL access classes and data transfer objects using a handwritten tool. You could use much the same approach as used in MS Patterns and Practices 'Web Service Software Factory' potentially, or any template-driven approach.
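On the raw-overhead question, a large share of reflection's cost is usually the per-call metadata lookup rather than the reflective read/write itself, so looking the metadata up once and caching it narrows the gap considerably. A rough, self-contained sketch of the comparison (in Java purely for illustration; absolute numbers are machine-dependent):

```java
import java.lang.reflect.Field;

public class ReflectionCost {
    public static class Entity { public long id; }

    // Returns nanoseconds for: direct writes, reflective writes with a
    // per-iteration Field lookup, and reflective writes with a cached Field.
    public static long[] bench(Entity e, int n) {
        try {
            long t0 = System.nanoTime();
            for (int i = 0; i < n; i++) e.id = i;        // direct assignment
            long direct = System.nanoTime() - t0;

            t0 = System.nanoTime();
            for (int i = 0; i < n; i++) {
                Field f = Entity.class.getField("id");   // metadata lookup every time
                f.setLong(e, i);
            }
            long uncached = System.nanoTime() - t0;

            Field cached = Entity.class.getField("id");  // metadata lookup once
            t0 = System.nanoTime();
            for (int i = 0; i < n; i++) cached.setLong(e, i);
            long cachedTime = System.nanoTime() - t0;

            return new long[] { direct, uncached, cachedTime };
        } catch (ReflectiveOperationException ex) {
            throw new RuntimeException(ex);
        }
    }

    public static void main(String[] args) {
        long[] ns = bench(new Entity(), 1_000_000);
        System.out.printf("direct=%.1fms uncached=%.1fms cached=%.1fms%n",
                ns[0] / 1e6, ns[1] / 1e6, ns[2] / 1e6);
    }
}
```

This is the shape of experiment behind the "overhead is not that bad" numbers: measured with cached metadata, reflective access typically lands within a modest constant factor of direct access, which may or may not matter in a highly transactional system.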